00:00:00.001 Started by upstream project "autotest-per-patch" build number 127117
00:00:00.001 originally caused by:
00:00:00.002 Started by user sys_sgci
00:00:00.065 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.066 The recommended git tool is: git
00:00:00.066 using credential 00000000-0000-0000-0000-000000000002
00:00:00.070 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.098 Fetching changes from the remote Git repository
00:00:00.100 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.144 Using shallow fetch with depth 1
00:00:00.144 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.144 > git --version # timeout=10
00:00:00.210 > git --version # 'git version 2.39.2'
00:00:00.210 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.227 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.227 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:05.436 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:05.446 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:05.457 Checking out Revision f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 (FETCH_HEAD)
00:00:05.457 > git config core.sparsecheckout # timeout=10
00:00:05.466 > git read-tree -mu HEAD # timeout=10
00:00:05.480 > git checkout -f f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=5
00:00:05.508 Commit message: "spdk-abi-per-patch: fix check-so-deps-docker-autotest parameters"
00:00:05.508 > git rev-list --no-walk f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08 # timeout=10
00:00:05.586 [Pipeline] Start of Pipeline
00:00:05.598 [Pipeline] library
00:00:05.600 Loading library shm_lib@master
00:00:05.600 Library shm_lib@master is cached. Copying from home.
00:00:05.615 [Pipeline] node
00:00:05.626 Running on WFP3 in /var/jenkins/workspace/crypto-phy-autotest
00:00:05.628 [Pipeline] {
00:00:05.635 [Pipeline] catchError
00:00:05.636 [Pipeline] {
00:00:05.646 [Pipeline] wrap
00:00:05.653 [Pipeline] {
00:00:05.659 [Pipeline] stage
00:00:05.661 [Pipeline] { (Prologue)
00:00:05.836 [Pipeline] sh
00:00:06.118 + logger -p user.info -t JENKINS-CI
00:00:06.138 [Pipeline] echo
00:00:06.139 Node: WFP3
00:00:06.150 [Pipeline] sh
00:00:06.445 [Pipeline] setCustomBuildProperty
00:00:06.454 [Pipeline] echo
00:00:06.456 Cleanup processes
00:00:06.460 [Pipeline] sh
00:00:06.739 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.739 79478 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.750 [Pipeline] sh
00:00:07.027 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:07.027 ++ grep -v 'sudo pgrep'
00:00:07.027 ++ awk '{print $1}'
00:00:07.027 + sudo kill -9
00:00:07.027 + true
00:00:07.041 [Pipeline] cleanWs
00:00:07.049 [WS-CLEANUP] Deleting project workspace...
00:00:07.049 [WS-CLEANUP] Deferred wipeout is used...
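The cleanup step traced above builds a PID list with `pgrep | grep -v | awk` and hands it to `kill -9`. A minimal sketch of that filter, using canned pgrep-style output in place of the privileged `sudo pgrep -af <workspace>/spdk` call (the PIDs and paths here are illustrative, not from the build):

```shell
# Canned stand-in for `sudo pgrep -af <workspace>/spdk` output:
# one line per match, PID first, then the full command line.
pgrep_out='123 sudo pgrep -af /ws/spdk
456 /ws/spdk/build/app'

# Drop the pgrep invocation itself, keep only the PID column.
pids=$(printf '%s\n' "$pgrep_out" | grep -v 'sudo pgrep' | awk '{print $1}')
echo "$pids"

# In the real step these PIDs go to `sudo kill -9 $pids`; the trailing
# `+ true` in the trace keeps an empty kill list from failing the build.
```

Note that when nothing but the pgrep itself matches, `$pids` is empty and `kill -9` exits nonzero, which is exactly why the trace ends the pipeline with `|| true`.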
00:00:07.055 [WS-CLEANUP] done
00:00:07.059 [Pipeline] setCustomBuildProperty
00:00:07.072 [Pipeline] sh
00:00:07.352 + sudo git config --global --replace-all safe.directory '*'
00:00:07.442 [Pipeline] httpRequest
00:00:07.482 [Pipeline] echo
00:00:07.484 Sorcerer 10.211.164.101 is alive
00:00:07.492 [Pipeline] httpRequest
00:00:07.497 HttpMethod: GET
00:00:07.498 URL: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:07.498 Sending request to url: http://10.211.164.101/packages/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:07.504 Response Code: HTTP/1.1 200 OK
00:00:07.505 Success: Status code 200 is in the accepted range: 200,404
00:00:07.505 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:09.608 [Pipeline] sh
00:00:09.889 + tar --no-same-owner -xf jbp_f0c44d8f8e3d61ecd9e3e442b9b5901b0cc7ca08.tar.gz
00:00:09.904 [Pipeline] httpRequest
00:00:09.932 [Pipeline] echo
00:00:09.933 Sorcerer 10.211.164.101 is alive
00:00:09.944 [Pipeline] httpRequest
00:00:09.948 HttpMethod: GET
00:00:09.948 URL: http://10.211.164.101/packages/spdk_68f79842378fdd3ebc3795ae0c42ef8e24177970.tar.gz
00:00:09.949 Sending request to url: http://10.211.164.101/packages/spdk_68f79842378fdd3ebc3795ae0c42ef8e24177970.tar.gz
00:00:09.970 Response Code: HTTP/1.1 200 OK
00:00:09.971 Success: Status code 200 is in the accepted range: 200,404
00:00:09.971 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_68f79842378fdd3ebc3795ae0c42ef8e24177970.tar.gz
00:00:57.374 [Pipeline] sh
00:00:57.658 + tar --no-same-owner -xf spdk_68f79842378fdd3ebc3795ae0c42ef8e24177970.tar.gz
00:01:00.204 [Pipeline] sh
00:01:00.487 + git -C spdk log --oneline -n5
00:01:00.487 68f798423 scripts/perf: Remove vhost/common.sh source from run_vhost_test.sh
00:01:00.487 8711e7e9b autotest: reduce accel tests runs with SPDK_TEST_ACCEL flag
00:01:00.487 50222f810 configure: don't exit on non Intel platforms
00:01:00.487 78cbcfdde test/scheduler: fix cpu mask for rpc governor tests
00:01:00.487 ba69d4678 event/scheduler: remove custom opts from static scheduler
00:01:00.502 [Pipeline] }
00:01:00.530 [Pipeline] // stage
00:01:00.542 [Pipeline] stage
00:01:00.544 [Pipeline] { (Prepare)
00:01:00.568 [Pipeline] writeFile
00:01:00.579 [Pipeline] sh
00:01:00.855 + logger -p user.info -t JENKINS-CI
00:01:00.865 [Pipeline] sh
00:01:01.143 + logger -p user.info -t JENKINS-CI
00:01:01.155 [Pipeline] sh
00:01:01.434 + cat autorun-spdk.conf
00:01:01.434 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:01.434 SPDK_TEST_BLOCKDEV=1
00:01:01.434 SPDK_TEST_ISAL=1
00:01:01.434 SPDK_TEST_CRYPTO=1
00:01:01.434 SPDK_TEST_REDUCE=1
00:01:01.434 SPDK_TEST_VBDEV_COMPRESS=1
00:01:01.434 SPDK_RUN_UBSAN=1
00:01:01.434 SPDK_TEST_ACCEL=1
00:01:01.439 RUN_NIGHTLY=0
00:01:01.444 [Pipeline] readFile
00:01:01.469 [Pipeline] withEnv
00:01:01.471 [Pipeline] {
00:01:01.483 [Pipeline] sh
00:01:01.762 + set -ex
00:01:01.762 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:01.762 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:01.762 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:01.762 ++ SPDK_TEST_BLOCKDEV=1
00:01:01.762 ++ SPDK_TEST_ISAL=1
00:01:01.762 ++ SPDK_TEST_CRYPTO=1
00:01:01.762 ++ SPDK_TEST_REDUCE=1
00:01:01.762 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:01.762 ++ SPDK_RUN_UBSAN=1
00:01:01.762 ++ SPDK_TEST_ACCEL=1
00:01:01.762 ++ RUN_NIGHTLY=0
00:01:01.762 + case $SPDK_TEST_NVMF_NICS in
00:01:01.762 + DRIVERS=
00:01:01.762 + [[ -n '' ]]
00:01:01.762 + exit 0
00:01:01.771 [Pipeline] }
00:01:01.788 [Pipeline] // withEnv
00:01:01.793 [Pipeline] }
00:01:01.810 [Pipeline] // stage
00:01:01.819 [Pipeline] catchError
00:01:01.821 [Pipeline] {
00:01:01.836 [Pipeline] timeout
00:01:01.836 Timeout set to expire in 1 hr 0 min
00:01:01.838 [Pipeline] {
00:01:01.853 [Pipeline] stage
00:01:01.855 [Pipeline] { (Tests)
00:01:01.870 [Pipeline] sh
00:01:02.152 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:01:02.152 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:01:02.152 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:01:02.152 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:01:02.152 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:02.152 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:01:02.152 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:01:02.152 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:02.152 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:01:02.152 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:02.152 + [[ crypto-phy-autotest == pkgdep-* ]]
00:01:02.152 + cd /var/jenkins/workspace/crypto-phy-autotest
00:01:02.152 + source /etc/os-release
00:01:02.152 ++ NAME='Fedora Linux'
00:01:02.152 ++ VERSION='38 (Cloud Edition)'
00:01:02.152 ++ ID=fedora
00:01:02.152 ++ VERSION_ID=38
00:01:02.152 ++ VERSION_CODENAME=
00:01:02.152 ++ PLATFORM_ID=platform:f38
00:01:02.152 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:02.152 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:02.152 ++ LOGO=fedora-logo-icon
00:01:02.152 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:02.152 ++ HOME_URL=https://fedoraproject.org/
00:01:02.152 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:02.152 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:02.153 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:02.153 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:02.153 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:02.153 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:02.153 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:02.153 ++ SUPPORT_END=2024-05-14
00:01:02.153 ++ VARIANT='Cloud Edition'
00:01:02.153 ++ VARIANT_ID=cloud
00:01:02.153 + uname -a
00:01:02.153 Linux spdk-wfp-03 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:01:02.153 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:05.444 Hugepages
00:01:05.444 node hugesize free / total
00:01:05.444 node0 1048576kB 0 / 0
00:01:05.444 node0 2048kB 0 / 0
00:01:05.444 node1 1048576kB 0 / 0
00:01:05.444 node1 2048kB 0 / 0
00:01:05.444
00:01:05.444 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:05.444 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:05.444 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:05.444 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:05.444 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2
00:01:05.444 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:05.444 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:05.444 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:05.444 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:05.444 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:05.445 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:05.445 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:05.445 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:05.445 + rm -f /tmp/spdk-ld-path
00:01:05.445 + source autorun-spdk.conf
00:01:05.445 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:05.445 ++ SPDK_TEST_BLOCKDEV=1
00:01:05.445 ++ SPDK_TEST_ISAL=1
00:01:05.445 ++ SPDK_TEST_CRYPTO=1
00:01:05.445 ++ SPDK_TEST_REDUCE=1
00:01:05.445 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:05.445 ++ SPDK_RUN_UBSAN=1
00:01:05.445 ++ SPDK_TEST_ACCEL=1
00:01:05.445 ++ RUN_NIGHTLY=0
00:01:05.445 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:05.445 + [[ -n '' ]]
00:01:05.445 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:05.445 + for M in /var/spdk/build-*-manifest.txt
00:01:05.445 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:05.445 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:05.445 + for M in /var/spdk/build-*-manifest.txt
00:01:05.445 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:05.445 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:05.445 ++ uname
00:01:05.445 + [[ Linux == \L\i\n\u\x ]]
00:01:05.445 + sudo dmesg -T
00:01:05.445 + sudo dmesg --clear
00:01:05.445 + dmesg_pid=80539
00:01:05.445 + [[ Fedora Linux == FreeBSD ]]
00:01:05.445 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:05.445 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:05.445 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:05.445 + [[ -x /usr/src/fio-static/fio ]]
00:01:05.445 + sudo dmesg -Tw
00:01:05.445 + export FIO_BIN=/usr/src/fio-static/fio
00:01:05.445 + FIO_BIN=/usr/src/fio-static/fio
00:01:05.445 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:05.445 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:05.445 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:05.445 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:05.445 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:05.445 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:05.445 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:05.445 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:05.445 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:05.445 Test configuration:
00:01:05.445 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:05.445 SPDK_TEST_BLOCKDEV=1
00:01:05.445 SPDK_TEST_ISAL=1
00:01:05.445 SPDK_TEST_CRYPTO=1
00:01:05.445 SPDK_TEST_REDUCE=1
00:01:05.445 SPDK_TEST_VBDEV_COMPRESS=1
00:01:05.445 SPDK_RUN_UBSAN=1
00:01:05.445 SPDK_TEST_ACCEL=1
00:01:05.445 RUN_NIGHTLY=0
23:22:50 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:01:05.445 23:22:50 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:05.445 23:22:50 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:05.445 23:22:50 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:05.445 23:22:50 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:05.445 23:22:50 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:05.445 23:22:50 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:05.445 23:22:50 -- paths/export.sh@5 -- $ export PATH
00:01:05.445 23:22:50 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:05.445 23:22:50 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:01:05.445 23:22:50 -- common/autobuild_common.sh@447 -- $ date +%s
00:01:05.445 23:22:50 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721856170.XXXXXX
00:01:05.445 23:22:50 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721856170.Zlvcpo
00:01:05.445 23:22:50 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:01:05.445 23:22:50 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:01:05.445 23:22:50 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
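The `autobuild_common.sh@447` trace above keys the per-build scratch directory off a Unix timestamp fed into `mktemp`'s template, producing names like `/tmp/spdk_1721856170.Zlvcpo`. A sketch of that naming scheme (variable names here are illustrative, not SPDK's):

```shell
# Epoch timestamp baked into mktemp's template, as in the trace:
# `mktemp -dt spdk_1721856170.XXXXXX` -> /tmp/spdk_1721856170.Zlvcpo
stamp=$(date +%s)
ws=$(mktemp -dt "spdk_${stamp}.XXXXXX")
echo "$ws"        # a fresh, uniquely named scratch directory
rmdir "$ws"       # clean up the demo directory
```

The random `XXXXXX` suffix is what makes concurrent builds on the same host safe; the timestamp alone would collide if two builds started in the same second.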
23:22:50 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:05.445 23:22:50 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:05.445 23:22:50 -- common/autobuild_common.sh@463 -- $ get_config_params
00:01:05.445 23:22:50 -- common/autotest_common.sh@398 -- $ xtrace_disable
00:01:05.445 23:22:50 -- common/autotest_common.sh@10 -- $ set +x
00:01:05.445 23:22:50 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:01:05.445 23:22:50 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:01:05.445 23:22:50 -- pm/common@17 -- $ local monitor
00:01:05.445 23:22:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:05.445 23:22:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:05.445 23:22:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:05.445 23:22:50 -- pm/common@21 -- $ date +%s
00:01:05.445 23:22:50 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:05.445 23:22:50 -- pm/common@21 -- $ date +%s
00:01:05.445 23:22:50 -- pm/common@25 -- $ sleep 1
00:01:05.445 23:22:50 -- pm/common@21 -- $ date +%s
00:01:05.445 23:22:50 -- pm/common@21 -- $ date +%s
00:01:05.445 23:22:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721856170
00:01:05.445 23:22:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721856170
00:01:05.445 23:22:50 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721856170
00:01:05.445 23:22:50 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721856170
00:01:05.445 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721856170_collect-vmstat.pm.log
00:01:05.445 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721856170_collect-cpu-load.pm.log
00:01:05.445 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721856170_collect-cpu-temp.pm.log
00:01:05.445 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721856170_collect-bmc-pm.bmc.pm.log
00:01:06.382 23:22:51 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:01:06.382 23:22:51 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:06.382 23:22:51 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:06.382 23:22:51 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:06.382 23:22:51 -- spdk/autobuild.sh@16 -- $ date -u
00:01:06.382 Wed Jul 24 09:22:51 PM UTC 2024
00:01:06.382 23:22:51 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:06.382 v24.09-pre-312-g68f798423
00:01:06.382 23:22:51 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:06.382 23:22:51 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:06.382 23:22:51 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:06.382 23:22:51 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
00:01:06.382 23:22:51 -- common/autotest_common.sh@1107 -- $ xtrace_disable
00:01:06.382 23:22:51 -- common/autotest_common.sh@10 -- $ set +x
00:01:06.382 ************************************
00:01:06.382 START TEST ubsan
************************************
00:01:06.382 23:22:51 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan'
00:01:06.382 using ubsan
00:01:06.382
00:01:06.382 real 0m0.000s
00:01:06.382 user 0m0.000s
00:01:06.382 sys 0m0.000s
00:01:06.382 23:22:51 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable
00:01:06.382 23:22:51 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:06.382 ************************************
00:01:06.382 END TEST ubsan
************************************
00:01:06.641 23:22:51 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:06.641 23:22:51 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:06.641 23:22:51 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:06.641 23:22:51 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:06.641 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:06.641 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:06.900 Using 'verbs' RDMA provider
00:01:20.077 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:32.287 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:32.287 Creating mk/config.mk...done.
00:01:32.287 Creating mk/cc.flags.mk...done.
00:01:32.287 Type 'make' to build.
00:01:32.287 23:23:16 -- spdk/autobuild.sh@69 -- $ run_test make make -j96
00:01:32.287 23:23:16 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
00:01:32.287 23:23:16 -- common/autotest_common.sh@1107 -- $ xtrace_disable
00:01:32.287 23:23:16 -- common/autotest_common.sh@10 -- $ set +x
00:01:32.287 ************************************
00:01:32.287 START TEST make
************************************
00:01:32.287 23:23:16 make -- common/autotest_common.sh@1125 -- $ make -j96
00:01:32.287 make[1]: Nothing to be done for 'all'.
00:02:04.385 The Meson build system
00:02:04.385 Version: 1.3.1
00:02:04.385 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:04.385 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:04.385 Build type: native build
00:02:04.385 Program cat found: YES (/usr/bin/cat)
00:02:04.385 Project name: DPDK
00:02:04.385 Project version: 24.03.0
00:02:04.385 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:04.385 C linker for the host machine: cc ld.bfd 2.39-16
00:02:04.385 Host machine cpu family: x86_64
00:02:04.385 Host machine cpu: x86_64
00:02:04.385 Message: ## Building in Developer Mode ##
00:02:04.385 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:04.385 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:04.385 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:04.385 Program python3 found: YES (/usr/bin/python3)
00:02:04.385 Program cat found: YES (/usr/bin/cat)
00:02:04.385 Compiler for C supports arguments -march=native: YES
00:02:04.385 Checking for size of "void *" : 8
00:02:04.385 Checking for size of "void *" : 8 (cached)
00:02:04.385 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:04.385 Library m found: YES
00:02:04.385 Library numa found: YES
00:02:04.385 Has header "numaif.h" : YES
00:02:04.385 Library fdt found: NO
00:02:04.385 Library execinfo found: NO
00:02:04.385 Has header "execinfo.h" : YES
00:02:04.385 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:04.385 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:04.385 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:04.385 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:04.385 Run-time dependency openssl found: YES 3.0.9
00:02:04.385 Run-time dependency libpcap found: YES 1.10.4
00:02:04.385 Has header "pcap.h" with dependency libpcap: YES
00:02:04.385 Compiler for C supports arguments -Wcast-qual: YES
00:02:04.385 Compiler for C supports arguments -Wdeprecated: YES
00:02:04.385 Compiler for C supports arguments -Wformat: YES
00:02:04.385 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:04.385 Compiler for C supports arguments -Wformat-security: NO
00:02:04.385 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:04.385 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:04.385 Compiler for C supports arguments -Wnested-externs: YES
00:02:04.385 Compiler for C supports arguments -Wold-style-definition: YES
00:02:04.385 Compiler for C supports arguments -Wpointer-arith: YES
00:02:04.385 Compiler for C supports arguments -Wsign-compare: YES
00:02:04.385 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:04.385 Compiler for C supports arguments -Wundef: YES
00:02:04.385 Compiler for C supports arguments -Wwrite-strings: YES
00:02:04.385 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:04.385 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:04.385 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:04.385 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:04.385 Program objdump found: YES (/usr/bin/objdump)
00:02:04.385 Compiler for C supports arguments -mavx512f: YES
00:02:04.385 Checking if "AVX512 checking" compiles: YES
00:02:04.385 Fetching value of define "__SSE4_2__" : 1
00:02:04.385 Fetching value of define "__AES__" : 1
00:02:04.385 Fetching value of define "__AVX__" : 1
00:02:04.385 Fetching value of define "__AVX2__" : 1
00:02:04.385 Fetching value of define "__AVX512BW__" : 1
00:02:04.385 Fetching value of define "__AVX512CD__" : 1
00:02:04.385 Fetching value of define "__AVX512DQ__" : 1
00:02:04.385 Fetching value of define "__AVX512F__" : 1
00:02:04.385 Fetching value of define "__AVX512VL__" : 1
00:02:04.385 Fetching value of define "__PCLMUL__" : 1
00:02:04.385 Fetching value of define "__RDRND__" : 1
00:02:04.385 Fetching value of define "__RDSEED__" : 1
00:02:04.385 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:04.386 Fetching value of define "__znver1__" : (undefined)
00:02:04.386 Fetching value of define "__znver2__" : (undefined)
00:02:04.386 Fetching value of define "__znver3__" : (undefined)
00:02:04.386 Fetching value of define "__znver4__" : (undefined)
00:02:04.386 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:04.386 Message: lib/log: Defining dependency "log"
00:02:04.386 Message: lib/kvargs: Defining dependency "kvargs"
00:02:04.386 Message: lib/telemetry: Defining dependency "telemetry"
00:02:04.386 Checking for function "getentropy" : NO
00:02:04.386 Message: lib/eal: Defining dependency "eal"
00:02:04.386 Message: lib/ring: Defining dependency "ring"
00:02:04.386 Message: lib/rcu: Defining dependency "rcu"
00:02:04.386 Message: lib/mempool: Defining dependency "mempool"
00:02:04.386 Message: lib/mbuf: Defining dependency "mbuf"
00:02:04.386 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:04.386 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:04.386 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:04.386 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:04.386 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:04.386 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:04.386 Compiler for C supports arguments -mpclmul: YES
00:02:04.386 Compiler for C supports arguments -maes: YES
00:02:04.386 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:04.386 Compiler for C supports arguments -mavx512bw: YES
00:02:04.386 Compiler for C supports arguments -mavx512dq: YES
00:02:04.386 Compiler for C supports arguments -mavx512vl: YES
00:02:04.386 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:04.386 Compiler for C supports arguments -mavx2: YES
00:02:04.386 Compiler for C supports arguments -mavx: YES
00:02:04.386 Message: lib/net: Defining dependency "net"
00:02:04.386 Message: lib/meter: Defining dependency "meter"
00:02:04.386 Message: lib/ethdev: Defining dependency "ethdev"
00:02:04.386 Message: lib/pci: Defining dependency "pci"
00:02:04.386 Message: lib/cmdline: Defining dependency "cmdline"
00:02:04.386 Message: lib/hash: Defining dependency "hash"
00:02:04.386 Message: lib/timer: Defining dependency "timer"
00:02:04.386 Message: lib/compressdev: Defining dependency "compressdev"
00:02:04.386 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:04.386 Message: lib/dmadev: Defining dependency "dmadev"
00:02:04.386 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:04.386 Message: lib/power: Defining dependency "power"
00:02:04.386 Message: lib/reorder: Defining dependency "reorder"
00:02:04.386 Message: lib/security: Defining dependency "security"
00:02:04.386 Has header "linux/userfaultfd.h" : YES
00:02:04.386 Has header "linux/vduse.h" : YES
00:02:04.386 Message: lib/vhost: Defining dependency "vhost"
00:02:04.386 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:04.386 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:04.386 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:04.386 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:04.386 Compiler for C supports arguments -std=c11: YES
00:02:04.386 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:04.386 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:04.386 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:04.386 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:04.386 Run-time dependency libmlx5 found: YES 1.24.46.0
00:02:04.386 Run-time dependency libibverbs found: YES 1.14.46.0
00:02:04.386 Library mtcr_ul found: NO
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:02:04.386 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:04.386 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies 
libmlx5, libibverbs: YES 00:02:04.387 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:04.387 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:04.387 Configuring mlx5_autoconf.h using configuration 00:02:04.387 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:04.387 Run-time dependency libcrypto found: YES 3.0.9 00:02:04.387 Library IPSec_MB found: YES 00:02:04.387 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:04.387 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:04.387 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:04.387 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:04.387 Library IPSec_MB found: YES 00:02:04.387 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:04.387 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:04.387 Compiler for C supports arguments 
-std=c11: YES (cached) 00:02:04.387 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:04.387 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:04.387 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:04.387 Library libisal found: NO 00:02:04.387 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:04.387 Compiler for C supports arguments -std=c11: YES (cached) 00:02:04.387 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:04.387 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:04.387 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:04.387 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:04.387 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:04.387 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:04.387 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:04.387 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:04.387 Program doxygen found: YES (/usr/bin/doxygen) 00:02:04.387 Configuring doxy-api-html.conf using configuration 00:02:04.387 Configuring doxy-api-man.conf using configuration 00:02:04.387 Program mandb found: YES (/usr/bin/mandb) 00:02:04.387 Program sphinx-build found: NO 00:02:04.387 Configuring rte_build_config.h using configuration 00:02:04.387 Message: 00:02:04.387 ================= 00:02:04.387 Applications Enabled 00:02:04.387 ================= 00:02:04.387 
00:02:04.387 apps: 00:02:04.387 00:02:04.387 00:02:04.387 Message: 00:02:04.387 ================= 00:02:04.387 Libraries Enabled 00:02:04.387 ================= 00:02:04.387 00:02:04.387 libs: 00:02:04.387 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:04.387 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:04.387 cryptodev, dmadev, power, reorder, security, vhost, 00:02:04.387 00:02:04.387 Message: 00:02:04.387 =============== 00:02:04.387 Drivers Enabled 00:02:04.387 =============== 00:02:04.387 00:02:04.387 common: 00:02:04.387 mlx5, qat, 00:02:04.387 bus: 00:02:04.387 auxiliary, pci, vdev, 00:02:04.387 mempool: 00:02:04.387 ring, 00:02:04.387 dma: 00:02:04.387 00:02:04.387 net: 00:02:04.387 00:02:04.387 crypto: 00:02:04.387 ipsec_mb, mlx5, 00:02:04.387 compress: 00:02:04.387 isal, mlx5, 00:02:04.387 vdpa: 00:02:04.387 00:02:04.387 00:02:04.387 Message: 00:02:04.387 ================= 00:02:04.387 Content Skipped 00:02:04.387 ================= 00:02:04.387 00:02:04.387 apps: 00:02:04.387 dumpcap: explicitly disabled via build config 00:02:04.387 graph: explicitly disabled via build config 00:02:04.387 pdump: explicitly disabled via build config 00:02:04.387 proc-info: explicitly disabled via build config 00:02:04.387 test-acl: explicitly disabled via build config 00:02:04.387 test-bbdev: explicitly disabled via build config 00:02:04.387 test-cmdline: explicitly disabled via build config 00:02:04.387 test-compress-perf: explicitly disabled via build config 00:02:04.387 test-crypto-perf: explicitly disabled via build config 00:02:04.387 test-dma-perf: explicitly disabled via build config 00:02:04.387 test-eventdev: explicitly disabled via build config 00:02:04.387 test-fib: explicitly disabled via build config 00:02:04.387 test-flow-perf: explicitly disabled via build config 00:02:04.387 test-gpudev: explicitly disabled via build config 00:02:04.387 test-mldev: explicitly disabled via build config 00:02:04.387 test-pipeline: explicitly 
disabled via build config 00:02:04.387 test-pmd: explicitly disabled via build config 00:02:04.387 test-regex: explicitly disabled via build config 00:02:04.387 test-sad: explicitly disabled via build config 00:02:04.387 test-security-perf: explicitly disabled via build config 00:02:04.387 00:02:04.387 libs: 00:02:04.387 argparse: explicitly disabled via build config 00:02:04.387 metrics: explicitly disabled via build config 00:02:04.387 acl: explicitly disabled via build config 00:02:04.387 bbdev: explicitly disabled via build config 00:02:04.387 bitratestats: explicitly disabled via build config 00:02:04.387 bpf: explicitly disabled via build config 00:02:04.387 cfgfile: explicitly disabled via build config 00:02:04.387 distributor: explicitly disabled via build config 00:02:04.387 efd: explicitly disabled via build config 00:02:04.387 eventdev: explicitly disabled via build config 00:02:04.387 dispatcher: explicitly disabled via build config 00:02:04.387 gpudev: explicitly disabled via build config 00:02:04.387 gro: explicitly disabled via build config 00:02:04.387 gso: explicitly disabled via build config 00:02:04.387 ip_frag: explicitly disabled via build config 00:02:04.387 jobstats: explicitly disabled via build config 00:02:04.387 latencystats: explicitly disabled via build config 00:02:04.387 lpm: explicitly disabled via build config 00:02:04.387 member: explicitly disabled via build config 00:02:04.387 pcapng: explicitly disabled via build config 00:02:04.387 rawdev: explicitly disabled via build config 00:02:04.387 regexdev: explicitly disabled via build config 00:02:04.387 mldev: explicitly disabled via build config 00:02:04.387 rib: explicitly disabled via build config 00:02:04.387 sched: explicitly disabled via build config 00:02:04.387 stack: explicitly disabled via build config 00:02:04.387 ipsec: explicitly disabled via build config 00:02:04.387 pdcp: explicitly disabled via build config 00:02:04.387 fib: explicitly disabled via build config 
00:02:04.387 port: explicitly disabled via build config 00:02:04.387 pdump: explicitly disabled via build config 00:02:04.387 table: explicitly disabled via build config 00:02:04.387 pipeline: explicitly disabled via build config 00:02:04.387 graph: explicitly disabled via build config 00:02:04.387 node: explicitly disabled via build config 00:02:04.387 00:02:04.387 drivers: 00:02:04.387 common/cpt: not in enabled drivers build config 00:02:04.387 common/dpaax: not in enabled drivers build config 00:02:04.387 common/iavf: not in enabled drivers build config 00:02:04.387 common/idpf: not in enabled drivers build config 00:02:04.387 common/ionic: not in enabled drivers build config 00:02:04.387 common/mvep: not in enabled drivers build config 00:02:04.387 common/octeontx: not in enabled drivers build config 00:02:04.387 bus/cdx: not in enabled drivers build config 00:02:04.387 bus/dpaa: not in enabled drivers build config 00:02:04.387 bus/fslmc: not in enabled drivers build config 00:02:04.387 bus/ifpga: not in enabled drivers build config 00:02:04.387 bus/platform: not in enabled drivers build config 00:02:04.387 bus/uacce: not in enabled drivers build config 00:02:04.387 bus/vmbus: not in enabled drivers build config 00:02:04.387 common/cnxk: not in enabled drivers build config 00:02:04.387 common/nfp: not in enabled drivers build config 00:02:04.387 common/nitrox: not in enabled drivers build config 00:02:04.387 common/sfc_efx: not in enabled drivers build config 00:02:04.387 mempool/bucket: not in enabled drivers build config 00:02:04.387 mempool/cnxk: not in enabled drivers build config 00:02:04.387 mempool/dpaa: not in enabled drivers build config 00:02:04.387 mempool/dpaa2: not in enabled drivers build config 00:02:04.387 mempool/octeontx: not in enabled drivers build config 00:02:04.387 mempool/stack: not in enabled drivers build config 00:02:04.387 dma/cnxk: not in enabled drivers build config 00:02:04.387 dma/dpaa: not in enabled drivers build config 
00:02:04.387 dma/dpaa2: not in enabled drivers build config 00:02:04.387 dma/hisilicon: not in enabled drivers build config 00:02:04.387 dma/idxd: not in enabled drivers build config 00:02:04.387 dma/ioat: not in enabled drivers build config 00:02:04.387 dma/skeleton: not in enabled drivers build config 00:02:04.387 net/af_packet: not in enabled drivers build config 00:02:04.387 net/af_xdp: not in enabled drivers build config 00:02:04.387 net/ark: not in enabled drivers build config 00:02:04.387 net/atlantic: not in enabled drivers build config 00:02:04.387 net/avp: not in enabled drivers build config 00:02:04.387 net/axgbe: not in enabled drivers build config 00:02:04.387 net/bnx2x: not in enabled drivers build config 00:02:04.388 net/bnxt: not in enabled drivers build config 00:02:04.388 net/bonding: not in enabled drivers build config 00:02:04.388 net/cnxk: not in enabled drivers build config 00:02:04.388 net/cpfl: not in enabled drivers build config 00:02:04.388 net/cxgbe: not in enabled drivers build config 00:02:04.388 net/dpaa: not in enabled drivers build config 00:02:04.388 net/dpaa2: not in enabled drivers build config 00:02:04.388 net/e1000: not in enabled drivers build config 00:02:04.388 net/ena: not in enabled drivers build config 00:02:04.388 net/enetc: not in enabled drivers build config 00:02:04.388 net/enetfec: not in enabled drivers build config 00:02:04.388 net/enic: not in enabled drivers build config 00:02:04.388 net/failsafe: not in enabled drivers build config 00:02:04.388 net/fm10k: not in enabled drivers build config 00:02:04.388 net/gve: not in enabled drivers build config 00:02:04.388 net/hinic: not in enabled drivers build config 00:02:04.388 net/hns3: not in enabled drivers build config 00:02:04.388 net/i40e: not in enabled drivers build config 00:02:04.388 net/iavf: not in enabled drivers build config 00:02:04.388 net/ice: not in enabled drivers build config 00:02:04.388 net/idpf: not in enabled drivers build config 00:02:04.388 
net/igc: not in enabled drivers build config 00:02:04.388 net/ionic: not in enabled drivers build config 00:02:04.388 net/ipn3ke: not in enabled drivers build config 00:02:04.388 net/ixgbe: not in enabled drivers build config 00:02:04.388 net/mana: not in enabled drivers build config 00:02:04.388 net/memif: not in enabled drivers build config 00:02:04.388 net/mlx4: not in enabled drivers build config 00:02:04.388 net/mlx5: not in enabled drivers build config 00:02:04.388 net/mvneta: not in enabled drivers build config 00:02:04.388 net/mvpp2: not in enabled drivers build config 00:02:04.388 net/netvsc: not in enabled drivers build config 00:02:04.388 net/nfb: not in enabled drivers build config 00:02:04.388 net/nfp: not in enabled drivers build config 00:02:04.388 net/ngbe: not in enabled drivers build config 00:02:04.388 net/null: not in enabled drivers build config 00:02:04.388 net/octeontx: not in enabled drivers build config 00:02:04.388 net/octeon_ep: not in enabled drivers build config 00:02:04.388 net/pcap: not in enabled drivers build config 00:02:04.388 net/pfe: not in enabled drivers build config 00:02:04.388 net/qede: not in enabled drivers build config 00:02:04.388 net/ring: not in enabled drivers build config 00:02:04.388 net/sfc: not in enabled drivers build config 00:02:04.388 net/softnic: not in enabled drivers build config 00:02:04.388 net/tap: not in enabled drivers build config 00:02:04.388 net/thunderx: not in enabled drivers build config 00:02:04.388 net/txgbe: not in enabled drivers build config 00:02:04.388 net/vdev_netvsc: not in enabled drivers build config 00:02:04.388 net/vhost: not in enabled drivers build config 00:02:04.388 net/virtio: not in enabled drivers build config 00:02:04.388 net/vmxnet3: not in enabled drivers build config 00:02:04.388 raw/*: missing internal dependency, "rawdev" 00:02:04.388 crypto/armv8: not in enabled drivers build config 00:02:04.388 crypto/bcmfs: not in enabled drivers build config 00:02:04.388 
crypto/caam_jr: not in enabled drivers build config 00:02:04.388 crypto/ccp: not in enabled drivers build config 00:02:04.388 crypto/cnxk: not in enabled drivers build config 00:02:04.388 crypto/dpaa_sec: not in enabled drivers build config 00:02:04.388 crypto/dpaa2_sec: not in enabled drivers build config 00:02:04.388 crypto/mvsam: not in enabled drivers build config 00:02:04.388 crypto/nitrox: not in enabled drivers build config 00:02:04.388 crypto/null: not in enabled drivers build config 00:02:04.388 crypto/octeontx: not in enabled drivers build config 00:02:04.388 crypto/openssl: not in enabled drivers build config 00:02:04.388 crypto/scheduler: not in enabled drivers build config 00:02:04.388 crypto/uadk: not in enabled drivers build config 00:02:04.388 crypto/virtio: not in enabled drivers build config 00:02:04.388 compress/nitrox: not in enabled drivers build config 00:02:04.388 compress/octeontx: not in enabled drivers build config 00:02:04.388 compress/zlib: not in enabled drivers build config 00:02:04.388 regex/*: missing internal dependency, "regexdev" 00:02:04.388 ml/*: missing internal dependency, "mldev" 00:02:04.388 vdpa/ifc: not in enabled drivers build config 00:02:04.388 vdpa/mlx5: not in enabled drivers build config 00:02:04.388 vdpa/nfp: not in enabled drivers build config 00:02:04.388 vdpa/sfc: not in enabled drivers build config 00:02:04.388 event/*: missing internal dependency, "eventdev" 00:02:04.388 baseband/*: missing internal dependency, "bbdev" 00:02:04.388 gpu/*: missing internal dependency, "gpudev" 00:02:04.388 00:02:04.388 00:02:04.388 Build targets in project: 115 00:02:04.388 00:02:04.388 DPDK 24.03.0 00:02:04.388 00:02:04.388 User defined options 00:02:04.388 buildtype : debug 00:02:04.388 default_library : shared 00:02:04.388 libdir : lib 00:02:04.388 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:04.388 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:04.388 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:04.388 cpu_instruction_set: native 00:02:04.388 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump 00:02:04.388 disable_libs : bbdev,argparse,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump 00:02:04.388 enable_docs : false 00:02:04.388 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:04.388 enable_kmods : false 00:02:04.388 max_lcores : 128 00:02:04.388 tests : false 00:02:04.388 00:02:04.388 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:04.388 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:04.388 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:04.388 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:04.388 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:04.388 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:04.388 [5/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:04.388 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:04.388 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:04.388 [8/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:04.388 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:04.388 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:04.388 [11/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:04.388 [12/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:04.388 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:04.388 [14/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:04.388 [15/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:04.388 [16/378] Linking static target lib/librte_kvargs.a 00:02:04.388 [17/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:04.388 [18/378] Linking static target lib/librte_log.a 00:02:04.388 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:04.388 [20/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:04.388 [21/378] Linking static target lib/librte_pci.a 00:02:04.388 [22/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:04.388 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:04.388 [24/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:04.649 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:04.649 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:04.649 [27/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:04.649 [28/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.649 [29/378] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:04.649 [30/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:04.649 [31/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:04.649 [32/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:04.649 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:04.649 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:04.649 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:04.649 [36/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:04.649 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:04.649 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:04.649 [39/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:04.649 [40/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:04.649 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:04.649 [42/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:04.649 [43/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:04.649 [44/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:04.649 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:04.649 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:04.649 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:04.649 [48/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:04.649 [49/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:04.649 [50/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:04.649 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:04.649 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:04.649 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:04.649 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:04.913 [55/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:04.913 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:04.914 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:04.914 [58/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:04.914 [59/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:04.914 [60/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:04.914 [61/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:04.914 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:04.914 [63/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:04.914 [64/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:04.914 [65/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:04.914 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:04.914 [67/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:04.914 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:04.914 [69/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:04.914 [70/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:04.914 [71/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:04.914 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:04.914 [73/378] Compiling C object 
lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:04.914 [74/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:04.914 [75/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:04.914 [76/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:04.914 [77/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:04.914 [78/378] Linking static target lib/librte_ring.a 00:02:04.914 [79/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:04.914 [80/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:04.914 [81/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:04.914 [82/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:04.914 [83/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:04.914 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:04.914 [85/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:04.914 [86/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:04.914 [87/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:04.914 [88/378] Linking static target lib/librte_meter.a 00:02:04.914 [89/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:04.914 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:04.914 [91/378] Linking static target lib/librte_telemetry.a 00:02:04.914 [92/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:04.914 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:04.914 [94/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:04.914 [95/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:04.914 [96/378] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:04.914 [97/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:04.914 [98/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:04.914 [99/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:04.914 [100/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:04.914 [101/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:04.914 [102/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:04.914 [103/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:04.914 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:04.914 [105/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:04.914 [106/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:04.914 [107/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:04.914 [108/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:04.914 [109/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:04.914 [110/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:04.914 [111/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:04.914 [112/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:04.914 [113/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:04.914 [114/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:04.914 [115/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:04.914 [116/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:04.914 [117/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:04.914 [118/378] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:04.914 [119/378] Linking static target lib/librte_rcu.a 00:02:04.914 [120/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:04.914 [121/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:04.914 [122/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:04.914 [123/378] Linking static target lib/librte_cmdline.a 00:02:04.914 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:04.914 [125/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:05.181 [126/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:05.181 [127/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:05.181 [128/378] Linking static target lib/librte_net.a 00:02:05.181 [129/378] Linking static target lib/librte_mempool.a 00:02:05.181 [130/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:05.181 [131/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:05.181 [132/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:05.181 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:05.181 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:05.181 [135/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.181 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:05.181 [137/378] Linking static target lib/librte_eal.a 00:02:05.181 [138/378] Linking target lib/librte_log.so.24.1 00:02:05.181 [139/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:05.181 [140/378] Linking static target lib/librte_mbuf.a 00:02:05.181 [141/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:05.440 [142/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:05.440 [143/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:05.440 [144/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:05.440 [145/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:05.440 [146/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.440 [147/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.440 [148/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:05.440 [149/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:05.440 [150/378] Linking static target lib/librte_timer.a 00:02:05.440 [151/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:05.440 [152/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:05.440 [153/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:05.440 [154/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:05.440 [155/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:05.440 [156/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:05.440 [157/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:05.440 [158/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:05.440 [159/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:05.440 [160/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:05.440 [161/378] Linking target lib/librte_kvargs.so.24.1 00:02:05.440 [162/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:05.440 [163/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:05.440 [164/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:05.440 [165/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:05.440 [166/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:05.440 [167/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:05.440 [168/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:05.441 [169/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:05.441 [170/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:05.441 [171/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:05.441 [172/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.441 [173/378] Linking static target lib/librte_compressdev.a 00:02:05.441 [174/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.700 [175/378] Linking static target lib/librte_dmadev.a 00:02:05.700 [176/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:05.700 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:05.700 [178/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:05.700 [179/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:05.700 [180/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:05.700 [181/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:05.700 [182/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:05.700 [183/378] Linking static target lib/librte_power.a 00:02:05.700 [184/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:05.700 [185/378] Compiling C object 
lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:05.700 [186/378] Linking static target lib/librte_reorder.a 00:02:05.700 [187/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:05.700 [188/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:05.700 [189/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:05.700 [190/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:05.700 [191/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:05.700 [192/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.700 [193/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:05.700 [194/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:05.700 [195/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:05.700 [196/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:05.700 [197/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:05.700 [198/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:05.700 [199/378] Linking static target lib/librte_security.a 00:02:05.700 [200/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:05.700 [201/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:05.700 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:05.700 [203/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:05.700 [204/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:05.700 [205/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:05.700 [206/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:05.700 [207/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:05.700 [208/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:05.700 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:05.700 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:05.700 [211/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:05.700 [212/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:05.700 [213/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:05.700 [214/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:05.700 [215/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:05.700 [216/378] Linking target lib/librte_telemetry.so.24.1 00:02:05.700 [217/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:05.700 [218/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:05.700 [219/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:05.700 [220/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:05.959 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:05.959 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:05.959 [223/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:05.959 [224/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:05.959 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:05.959 [226/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:05.959 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:05.959 [228/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.959 [229/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:05.959 [230/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:05.959 [231/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:05.959 [232/378] Linking static target drivers/librte_bus_vdev.a 00:02:05.959 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:05.959 [234/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:05.959 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:05.959 [236/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:05.959 [237/378] Linking static target lib/librte_hash.a 00:02:05.959 [238/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:05.959 [239/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:05.959 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:05.959 [241/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:05.959 [242/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:05.959 [243/378] Compiling C object 
drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:05.959 [244/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:05.959 [245/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:05.959 [246/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:05.959 [247/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.959 [248/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:05.959 [249/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:05.959 [250/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:05.959 [251/378] Linking static target drivers/librte_bus_pci.a 00:02:05.959 [252/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.959 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:05.959 [254/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:05.959 [255/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:05.959 [256/378] Linking static target lib/librte_cryptodev.a 00:02:05.959 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:05.959 [258/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.959 [259/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.217 [260/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:06.217 [261/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:06.217 [262/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:06.218 [263/378] 
Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:06.218 [264/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.218 [265/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:06.218 [266/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.218 [267/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:06.218 [268/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:06.218 [269/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.218 [270/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:06.218 [271/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.218 [272/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:06.218 [273/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:06.218 [274/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:06.218 [275/378] Linking static target drivers/librte_mempool_ring.a 00:02:06.218 [276/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:06.218 [277/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:06.218 [278/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:06.218 [279/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.218 [280/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:06.218 [281/378] Compiling 
C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:06.218 [282/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:06.218 [283/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:06.218 [284/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:06.218 [285/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:06.218 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:06.218 [287/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:06.476 [288/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:06.476 [289/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:06.476 [290/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:06.476 [291/378] Linking static target drivers/librte_compress_mlx5.a 00:02:06.476 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:06.476 [293/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:06.476 [294/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:06.476 [295/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:06.476 [296/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:06.476 [297/378] Linking static target lib/librte_ethdev.a 00:02:06.476 [298/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.476 [299/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:06.476 [300/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:06.476 [301/378] Compiling C object 
drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:06.476 [302/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:06.476 [303/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:06.476 [304/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:06.476 [305/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:06.476 [306/378] Linking static target drivers/librte_compress_isal.a 00:02:06.734 [307/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:06.734 [308/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:06.734 [309/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:06.734 [310/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:06.734 [311/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:06.734 [312/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:06.734 [313/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.734 [314/378] Linking static target drivers/librte_common_mlx5.a 00:02:06.734 [315/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.734 [316/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:06.734 [317/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:06.993 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:06.993 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:07.252 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:07.252 [321/378] Compiling C object 
drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:07.252 [322/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:07.252 [323/378] Linking static target drivers/librte_common_qat.a 00:02:07.818 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:07.818 [325/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.818 [326/378] Linking static target lib/librte_vhost.a 00:02:09.720 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.729 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:13.634 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.571 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.571 [331/378] Linking target lib/librte_eal.so.24.1 00:02:14.830 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:14.830 [333/378] Linking target lib/librte_meter.so.24.1 00:02:14.830 [334/378] Linking target lib/librte_ring.so.24.1 00:02:14.830 [335/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:14.830 [336/378] Linking target lib/librte_pci.so.24.1 00:02:14.830 [337/378] Linking target lib/librte_timer.so.24.1 00:02:14.830 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:14.830 [339/378] Linking target lib/librte_dmadev.so.24.1 00:02:15.088 [340/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:15.088 [341/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:15.088 [342/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:15.088 [343/378] Generating symbol file 
lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:15.088 [344/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:15.088 [345/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:15.088 [346/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:15.088 [347/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:15.088 [348/378] Linking target lib/librte_rcu.so.24.1 00:02:15.088 [349/378] Linking target lib/librte_mempool.so.24.1 00:02:15.088 [350/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:15.088 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:15.088 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:15.347 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:15.347 [354/378] Linking target lib/librte_mbuf.so.24.1 00:02:15.347 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:15.347 [356/378] Linking target lib/librte_compressdev.so.24.1 00:02:15.347 [357/378] Linking target lib/librte_net.so.24.1 00:02:15.347 [358/378] Linking target lib/librte_reorder.so.24.1 00:02:15.348 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:15.606 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:15.606 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:15.606 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:15.606 [363/378] Linking target lib/librte_security.so.24.1 00:02:15.606 [364/378] Linking target lib/librte_cmdline.so.24.1 00:02:15.606 [365/378] Linking target lib/librte_hash.so.24.1 00:02:15.606 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:15.606 
[367/378] Linking target lib/librte_ethdev.so.24.1 00:02:15.606 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:15.606 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:15.864 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:15.864 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:15.864 [372/378] Linking target lib/librte_power.so.24.1 00:02:15.864 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:15.864 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:15.864 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:15.864 [376/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:15.864 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:15.864 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:15.864 INFO: autodetecting backend as ninja 00:02:15.864 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 96 00:02:16.799 CC lib/ut/ut.o 00:02:16.799 CC lib/ut_mock/mock.o 00:02:17.057 CC lib/log/log.o 00:02:17.057 CC lib/log/log_flags.o 00:02:17.057 CC lib/log/log_deprecated.o 00:02:17.057 LIB libspdk_ut.a 00:02:17.057 LIB libspdk_ut_mock.a 00:02:17.057 SO libspdk_ut.so.2.0 00:02:17.057 LIB libspdk_log.a 00:02:17.057 SO libspdk_ut_mock.so.6.0 00:02:17.057 SO libspdk_log.so.7.0 00:02:17.057 SYMLINK libspdk_ut.so 00:02:17.057 SYMLINK libspdk_ut_mock.so 00:02:17.315 SYMLINK libspdk_log.so 00:02:17.573 CC lib/util/base64.o 00:02:17.573 CC lib/util/bit_array.o 00:02:17.573 CC lib/util/cpuset.o 00:02:17.573 CC lib/util/crc16.o 00:02:17.573 CC lib/util/crc32.o 00:02:17.573 CC lib/util/crc32c.o 00:02:17.573 CC lib/util/crc32_ieee.o 00:02:17.573 CC lib/util/crc64.o 00:02:17.573 CC lib/util/dif.o 00:02:17.573 CC 
lib/util/fd_group.o 00:02:17.573 CC lib/util/fd.o 00:02:17.573 CC lib/util/file.o 00:02:17.573 CC lib/util/hexlify.o 00:02:17.573 CC lib/util/math.o 00:02:17.573 CC lib/util/iov.o 00:02:17.573 CC lib/util/net.o 00:02:17.573 CC lib/util/pipe.o 00:02:17.573 CC lib/util/strerror_tls.o 00:02:17.573 CC lib/util/string.o 00:02:17.573 CC lib/util/uuid.o 00:02:17.573 CC lib/util/xor.o 00:02:17.573 CC lib/util/zipf.o 00:02:17.573 CC lib/dma/dma.o 00:02:17.573 CC lib/ioat/ioat.o 00:02:17.573 CXX lib/trace_parser/trace.o 00:02:17.573 CC lib/vfio_user/host/vfio_user_pci.o 00:02:17.573 CC lib/vfio_user/host/vfio_user.o 00:02:17.573 LIB libspdk_dma.a 00:02:17.831 SO libspdk_dma.so.4.0 00:02:17.831 LIB libspdk_ioat.a 00:02:17.831 SYMLINK libspdk_dma.so 00:02:17.831 SO libspdk_ioat.so.7.0 00:02:17.831 LIB libspdk_vfio_user.a 00:02:17.831 SYMLINK libspdk_ioat.so 00:02:17.831 SO libspdk_vfio_user.so.5.0 00:02:17.831 LIB libspdk_util.a 00:02:17.831 SYMLINK libspdk_vfio_user.so 00:02:18.088 SO libspdk_util.so.10.0 00:02:18.088 SYMLINK libspdk_util.so 00:02:18.088 LIB libspdk_trace_parser.a 00:02:18.346 SO libspdk_trace_parser.so.5.0 00:02:18.346 SYMLINK libspdk_trace_parser.so 00:02:18.346 CC lib/rdma_utils/rdma_utils.o 00:02:18.346 CC lib/rdma_provider/common.o 00:02:18.346 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:18.346 CC lib/reduce/reduce.o 00:02:18.346 CC lib/json/json_parse.o 00:02:18.346 CC lib/json/json_util.o 00:02:18.346 CC lib/json/json_write.o 00:02:18.346 CC lib/idxd/idxd.o 00:02:18.346 CC lib/idxd/idxd_user.o 00:02:18.346 CC lib/idxd/idxd_kernel.o 00:02:18.346 CC lib/conf/conf.o 00:02:18.346 CC lib/env_dpdk/env.o 00:02:18.346 CC lib/env_dpdk/memory.o 00:02:18.346 CC lib/env_dpdk/init.o 00:02:18.346 CC lib/env_dpdk/pci.o 00:02:18.346 CC lib/vmd/vmd.o 00:02:18.346 CC lib/vmd/led.o 00:02:18.346 CC lib/env_dpdk/threads.o 00:02:18.346 CC lib/env_dpdk/pci_ioat.o 00:02:18.346 CC lib/env_dpdk/pci_virtio.o 00:02:18.346 CC lib/env_dpdk/pci_vmd.o 00:02:18.346 CC 
lib/env_dpdk/pci_idxd.o 00:02:18.346 CC lib/env_dpdk/pci_event.o 00:02:18.346 CC lib/env_dpdk/sigbus_handler.o 00:02:18.346 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:18.346 CC lib/env_dpdk/pci_dpdk.o 00:02:18.346 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:18.606 LIB libspdk_rdma_provider.a 00:02:18.606 SO libspdk_rdma_provider.so.6.0 00:02:18.606 LIB libspdk_rdma_utils.a 00:02:18.606 LIB libspdk_conf.a 00:02:18.606 SO libspdk_rdma_utils.so.1.0 00:02:18.606 LIB libspdk_json.a 00:02:18.606 SYMLINK libspdk_rdma_provider.so 00:02:18.606 SO libspdk_conf.so.6.0 00:02:18.606 SYMLINK libspdk_rdma_utils.so 00:02:18.606 SO libspdk_json.so.6.0 00:02:18.606 SYMLINK libspdk_conf.so 00:02:18.864 SYMLINK libspdk_json.so 00:02:18.864 LIB libspdk_idxd.a 00:02:18.864 SO libspdk_idxd.so.12.0 00:02:18.864 LIB libspdk_reduce.a 00:02:18.864 LIB libspdk_vmd.a 00:02:18.864 SO libspdk_reduce.so.6.1 00:02:18.864 SYMLINK libspdk_idxd.so 00:02:18.864 SO libspdk_vmd.so.6.0 00:02:18.864 SYMLINK libspdk_reduce.so 00:02:19.121 SYMLINK libspdk_vmd.so 00:02:19.121 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:19.121 CC lib/jsonrpc/jsonrpc_server.o 00:02:19.121 CC lib/jsonrpc/jsonrpc_client.o 00:02:19.121 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:19.378 LIB libspdk_jsonrpc.a 00:02:19.378 SO libspdk_jsonrpc.so.6.0 00:02:19.378 SYMLINK libspdk_jsonrpc.so 00:02:19.378 LIB libspdk_env_dpdk.a 00:02:19.378 SO libspdk_env_dpdk.so.15.0 00:02:19.636 SYMLINK libspdk_env_dpdk.so 00:02:19.636 CC lib/rpc/rpc.o 00:02:19.894 LIB libspdk_rpc.a 00:02:19.894 SO libspdk_rpc.so.6.0 00:02:19.894 SYMLINK libspdk_rpc.so 00:02:20.152 CC lib/keyring/keyring.o 00:02:20.152 CC lib/keyring/keyring_rpc.o 00:02:20.152 CC lib/trace/trace.o 00:02:20.152 CC lib/trace/trace_flags.o 00:02:20.152 CC lib/trace/trace_rpc.o 00:02:20.152 CC lib/notify/notify.o 00:02:20.152 CC lib/notify/notify_rpc.o 00:02:20.411 LIB libspdk_notify.a 00:02:20.411 SO libspdk_notify.so.6.0 00:02:20.411 LIB libspdk_keyring.a 00:02:20.411 LIB libspdk_trace.a 00:02:20.411 
SO libspdk_keyring.so.1.0 00:02:20.411 SYMLINK libspdk_notify.so 00:02:20.411 SO libspdk_trace.so.10.0 00:02:20.411 SYMLINK libspdk_keyring.so 00:02:20.669 SYMLINK libspdk_trace.so 00:02:20.927 CC lib/sock/sock.o 00:02:20.927 CC lib/sock/sock_rpc.o 00:02:20.927 CC lib/thread/thread.o 00:02:20.927 CC lib/thread/iobuf.o 00:02:21.186 LIB libspdk_sock.a 00:02:21.186 SO libspdk_sock.so.10.0 00:02:21.186 SYMLINK libspdk_sock.so 00:02:21.443 CC lib/nvme/nvme_ctrlr.o 00:02:21.443 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:21.443 CC lib/nvme/nvme_fabric.o 00:02:21.443 CC lib/nvme/nvme_ns_cmd.o 00:02:21.443 CC lib/nvme/nvme_ns.o 00:02:21.443 CC lib/nvme/nvme_pcie_common.o 00:02:21.443 CC lib/nvme/nvme_pcie.o 00:02:21.443 CC lib/nvme/nvme_qpair.o 00:02:21.443 CC lib/nvme/nvme.o 00:02:21.443 CC lib/nvme/nvme_quirks.o 00:02:21.443 CC lib/nvme/nvme_transport.o 00:02:21.443 CC lib/nvme/nvme_discovery.o 00:02:21.443 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:21.443 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:21.443 CC lib/nvme/nvme_tcp.o 00:02:21.443 CC lib/nvme/nvme_io_msg.o 00:02:21.443 CC lib/nvme/nvme_opal.o 00:02:21.443 CC lib/nvme/nvme_zns.o 00:02:21.444 CC lib/nvme/nvme_poll_group.o 00:02:21.444 CC lib/nvme/nvme_stubs.o 00:02:21.444 CC lib/nvme/nvme_auth.o 00:02:21.444 CC lib/nvme/nvme_cuse.o 00:02:21.444 CC lib/nvme/nvme_rdma.o 00:02:22.007 LIB libspdk_thread.a 00:02:22.007 SO libspdk_thread.so.10.1 00:02:22.007 SYMLINK libspdk_thread.so 00:02:22.265 CC lib/blob/blobstore.o 00:02:22.265 CC lib/blob/zeroes.o 00:02:22.265 CC lib/blob/request.o 00:02:22.265 CC lib/blob/blob_bs_dev.o 00:02:22.265 CC lib/virtio/virtio.o 00:02:22.265 CC lib/virtio/virtio_vhost_user.o 00:02:22.265 CC lib/virtio/virtio_pci.o 00:02:22.265 CC lib/virtio/virtio_vfio_user.o 00:02:22.265 CC lib/accel/accel_rpc.o 00:02:22.265 CC lib/accel/accel.o 00:02:22.265 CC lib/accel/accel_sw.o 00:02:22.265 CC lib/init/json_config.o 00:02:22.265 CC lib/init/subsystem.o 00:02:22.265 CC lib/init/subsystem_rpc.o 00:02:22.265 CC 
lib/init/rpc.o 00:02:22.522 LIB libspdk_init.a 00:02:22.522 SO libspdk_init.so.5.0 00:02:22.522 LIB libspdk_virtio.a 00:02:22.522 SO libspdk_virtio.so.7.0 00:02:22.522 SYMLINK libspdk_init.so 00:02:22.523 SYMLINK libspdk_virtio.so 00:02:22.780 CC lib/event/log_rpc.o 00:02:22.780 CC lib/event/app.o 00:02:22.780 CC lib/event/reactor.o 00:02:22.780 CC lib/event/scheduler_static.o 00:02:22.780 CC lib/event/app_rpc.o 00:02:23.039 LIB libspdk_accel.a 00:02:23.039 SO libspdk_accel.so.16.0 00:02:23.039 LIB libspdk_nvme.a 00:02:23.039 SYMLINK libspdk_accel.so 00:02:23.297 SO libspdk_nvme.so.13.1 00:02:23.297 LIB libspdk_event.a 00:02:23.297 SO libspdk_event.so.14.0 00:02:23.297 SYMLINK libspdk_event.so 00:02:23.297 CC lib/bdev/bdev.o 00:02:23.297 CC lib/bdev/bdev_rpc.o 00:02:23.297 CC lib/bdev/part.o 00:02:23.297 CC lib/bdev/bdev_zone.o 00:02:23.297 CC lib/bdev/scsi_nvme.o 00:02:23.297 SYMLINK libspdk_nvme.so 00:02:24.232 LIB libspdk_blob.a 00:02:24.490 SO libspdk_blob.so.11.0 00:02:24.490 SYMLINK libspdk_blob.so 00:02:24.749 CC lib/blobfs/blobfs.o 00:02:24.749 CC lib/blobfs/tree.o 00:02:24.749 CC lib/lvol/lvol.o 00:02:25.315 LIB libspdk_bdev.a 00:02:25.315 SO libspdk_bdev.so.16.0 00:02:25.315 LIB libspdk_blobfs.a 00:02:25.315 SYMLINK libspdk_bdev.so 00:02:25.315 SO libspdk_blobfs.so.10.0 00:02:25.315 LIB libspdk_lvol.a 00:02:25.315 SYMLINK libspdk_blobfs.so 00:02:25.315 SO libspdk_lvol.so.10.0 00:02:25.573 SYMLINK libspdk_lvol.so 00:02:25.573 CC lib/scsi/dev.o 00:02:25.573 CC lib/scsi/scsi.o 00:02:25.573 CC lib/scsi/lun.o 00:02:25.573 CC lib/scsi/port.o 00:02:25.573 CC lib/scsi/scsi_bdev.o 00:02:25.573 CC lib/scsi/task.o 00:02:25.573 CC lib/scsi/scsi_pr.o 00:02:25.573 CC lib/scsi/scsi_rpc.o 00:02:25.573 CC lib/nbd/nbd.o 00:02:25.573 CC lib/nbd/nbd_rpc.o 00:02:25.573 CC lib/ftl/ftl_core.o 00:02:25.573 CC lib/ftl/ftl_layout.o 00:02:25.573 CC lib/ftl/ftl_init.o 00:02:25.573 CC lib/ftl/ftl_debug.o 00:02:25.573 CC lib/ftl/ftl_io.o 00:02:25.573 CC lib/ftl/ftl_sb.o 00:02:25.573 
CC lib/ftl/ftl_l2p_flat.o 00:02:25.573 CC lib/ftl/ftl_l2p.o 00:02:25.573 CC lib/ftl/ftl_nv_cache.o 00:02:25.573 CC lib/ftl/ftl_band.o 00:02:25.573 CC lib/nvmf/ctrlr.o 00:02:25.573 CC lib/ftl/ftl_band_ops.o 00:02:25.573 CC lib/ftl/ftl_writer.o 00:02:25.573 CC lib/nvmf/ctrlr_discovery.o 00:02:25.573 CC lib/nvmf/subsystem.o 00:02:25.573 CC lib/nvmf/ctrlr_bdev.o 00:02:25.573 CC lib/ftl/ftl_rq.o 00:02:25.573 CC lib/ftl/ftl_p2l.o 00:02:25.573 CC lib/ftl/ftl_reloc.o 00:02:25.573 CC lib/ftl/ftl_l2p_cache.o 00:02:25.573 CC lib/nvmf/nvmf.o 00:02:25.573 CC lib/nvmf/nvmf_rpc.o 00:02:25.573 CC lib/nvmf/tcp.o 00:02:25.573 CC lib/ublk/ublk.o 00:02:25.573 CC lib/nvmf/transport.o 00:02:25.573 CC lib/ublk/ublk_rpc.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:25.573 CC lib/nvmf/stubs.o 00:02:25.573 CC lib/nvmf/mdns_server.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:25.573 CC lib/nvmf/rdma.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:25.573 CC lib/nvmf/auth.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:25.573 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:25.573 CC lib/ftl/utils/ftl_conf.o 00:02:25.574 CC lib/ftl/utils/ftl_md.o 00:02:25.574 CC lib/ftl/utils/ftl_mempool.o 00:02:25.574 CC lib/ftl/utils/ftl_bitmap.o 00:02:25.574 CC lib/ftl/utils/ftl_property.o 00:02:25.574 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:25.574 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:25.574 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:25.574 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:25.574 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:25.574 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:25.574 CC 
lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:25.574 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:25.574 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:25.574 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:25.574 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:25.574 CC lib/ftl/base/ftl_base_dev.o 00:02:25.574 CC lib/ftl/ftl_trace.o 00:02:25.574 CC lib/ftl/base/ftl_base_bdev.o 00:02:26.140 LIB libspdk_nbd.a 00:02:26.140 SO libspdk_nbd.so.7.0 00:02:26.140 SYMLINK libspdk_nbd.so 00:02:26.140 LIB libspdk_scsi.a 00:02:26.399 SO libspdk_scsi.so.9.0 00:02:26.399 LIB libspdk_ublk.a 00:02:26.399 SO libspdk_ublk.so.3.0 00:02:26.399 SYMLINK libspdk_scsi.so 00:02:26.399 SYMLINK libspdk_ublk.so 00:02:26.657 LIB libspdk_ftl.a 00:02:26.657 CC lib/vhost/vhost.o 00:02:26.657 CC lib/iscsi/conn.o 00:02:26.657 CC lib/vhost/vhost_rpc.o 00:02:26.657 CC lib/vhost/vhost_blk.o 00:02:26.657 CC lib/vhost/vhost_scsi.o 00:02:26.657 CC lib/iscsi/iscsi.o 00:02:26.657 CC lib/iscsi/init_grp.o 00:02:26.657 CC lib/vhost/rte_vhost_user.o 00:02:26.657 CC lib/iscsi/portal_grp.o 00:02:26.657 CC lib/iscsi/md5.o 00:02:26.657 CC lib/iscsi/param.o 00:02:26.657 CC lib/iscsi/tgt_node.o 00:02:26.657 CC lib/iscsi/iscsi_subsystem.o 00:02:26.657 CC lib/iscsi/iscsi_rpc.o 00:02:26.657 CC lib/iscsi/task.o 00:02:26.657 SO libspdk_ftl.so.9.0 00:02:26.916 SYMLINK libspdk_ftl.so 00:02:27.482 LIB libspdk_nvmf.a 00:02:27.482 SO libspdk_nvmf.so.19.0 00:02:27.482 LIB libspdk_vhost.a 00:02:27.482 SO libspdk_vhost.so.8.0 00:02:27.482 SYMLINK libspdk_nvmf.so 00:02:27.740 SYMLINK libspdk_vhost.so 00:02:27.740 LIB libspdk_iscsi.a 00:02:27.740 SO libspdk_iscsi.so.8.0 00:02:27.740 SYMLINK libspdk_iscsi.so 00:02:28.306 CC module/env_dpdk/env_dpdk_rpc.o 00:02:28.306 CC module/keyring/file/keyring.o 00:02:28.306 CC module/keyring/file/keyring_rpc.o 00:02:28.306 CC module/keyring/linux/keyring.o 00:02:28.306 CC module/keyring/linux/keyring_rpc.o 00:02:28.306 CC module/blob/bdev/blob_bdev.o 00:02:28.306 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:28.306 CC 
module/sock/posix/posix.o 00:02:28.306 CC module/accel/iaa/accel_iaa.o 00:02:28.306 CC module/accel/iaa/accel_iaa_rpc.o 00:02:28.306 CC module/accel/error/accel_error_rpc.o 00:02:28.306 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:28.306 CC module/accel/error/accel_error.o 00:02:28.306 CC module/accel/dsa/accel_dsa.o 00:02:28.306 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:28.306 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:28.306 CC module/accel/dsa/accel_dsa_rpc.o 00:02:28.563 CC module/accel/ioat/accel_ioat.o 00:02:28.563 CC module/accel/ioat/accel_ioat_rpc.o 00:02:28.563 CC module/scheduler/gscheduler/gscheduler.o 00:02:28.563 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:28.563 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:28.563 LIB libspdk_env_dpdk_rpc.a 00:02:28.563 SO libspdk_env_dpdk_rpc.so.6.0 00:02:28.563 SYMLINK libspdk_env_dpdk_rpc.so 00:02:28.563 LIB libspdk_keyring_linux.a 00:02:28.563 LIB libspdk_keyring_file.a 00:02:28.563 LIB libspdk_scheduler_dpdk_governor.a 00:02:28.563 LIB libspdk_scheduler_gscheduler.a 00:02:28.563 LIB libspdk_accel_error.a 00:02:28.563 SO libspdk_keyring_linux.so.1.0 00:02:28.563 LIB libspdk_scheduler_dynamic.a 00:02:28.563 SO libspdk_keyring_file.so.1.0 00:02:28.563 LIB libspdk_accel_ioat.a 00:02:28.563 SO libspdk_scheduler_dynamic.so.4.0 00:02:28.563 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:28.563 SO libspdk_scheduler_gscheduler.so.4.0 00:02:28.563 SO libspdk_accel_error.so.2.0 00:02:28.563 LIB libspdk_accel_iaa.a 00:02:28.563 SO libspdk_accel_ioat.so.6.0 00:02:28.563 SYMLINK libspdk_keyring_file.so 00:02:28.563 LIB libspdk_blob_bdev.a 00:02:28.563 SYMLINK libspdk_keyring_linux.so 00:02:28.563 SO libspdk_accel_iaa.so.3.0 00:02:28.563 SYMLINK libspdk_scheduler_gscheduler.so 00:02:28.563 SYMLINK libspdk_scheduler_dynamic.so 00:02:28.563 LIB libspdk_accel_dsa.a 00:02:28.820 SO libspdk_blob_bdev.so.11.0 00:02:28.820 SYMLINK 
libspdk_scheduler_dpdk_governor.so 00:02:28.820 SYMLINK libspdk_accel_error.so 00:02:28.820 SYMLINK libspdk_accel_ioat.so 00:02:28.820 SO libspdk_accel_dsa.so.5.0 00:02:28.820 SYMLINK libspdk_accel_iaa.so 00:02:28.820 SYMLINK libspdk_blob_bdev.so 00:02:28.820 SYMLINK libspdk_accel_dsa.so 00:02:29.079 LIB libspdk_sock_posix.a 00:02:29.079 SO libspdk_sock_posix.so.6.0 00:02:29.079 SYMLINK libspdk_sock_posix.so 00:02:29.079 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:29.079 CC module/bdev/lvol/vbdev_lvol.o 00:02:29.079 CC module/bdev/delay/vbdev_delay.o 00:02:29.079 CC module/bdev/error/vbdev_error.o 00:02:29.079 CC module/bdev/error/vbdev_error_rpc.o 00:02:29.079 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:29.079 CC module/bdev/malloc/bdev_malloc.o 00:02:29.079 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:29.079 CC module/bdev/ftl/bdev_ftl.o 00:02:29.079 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:29.079 CC module/bdev/split/vbdev_split.o 00:02:29.079 CC module/bdev/crypto/vbdev_crypto.o 00:02:29.079 CC module/bdev/split/vbdev_split_rpc.o 00:02:29.079 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:29.079 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:29.079 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:29.079 CC module/bdev/aio/bdev_aio_rpc.o 00:02:29.079 CC module/bdev/aio/bdev_aio.o 00:02:29.079 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:29.079 CC module/bdev/nvme/bdev_nvme.o 00:02:29.079 CC module/bdev/nvme/nvme_rpc.o 00:02:29.079 CC module/bdev/gpt/gpt.o 00:02:29.079 CC module/bdev/nvme/bdev_mdns_client.o 00:02:29.079 CC module/bdev/gpt/vbdev_gpt.o 00:02:29.079 CC module/bdev/nvme/vbdev_opal.o 00:02:29.079 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:29.079 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:29.079 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:29.079 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:29.079 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:29.079 CC module/bdev/compress/vbdev_compress.o 00:02:29.079 CC 
module/bdev/virtio/bdev_virtio_scsi.o 00:02:29.079 CC module/bdev/iscsi/bdev_iscsi.o 00:02:29.079 CC module/bdev/null/bdev_null.o 00:02:29.079 CC module/bdev/null/bdev_null_rpc.o 00:02:29.079 CC module/bdev/passthru/vbdev_passthru.o 00:02:29.079 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:29.079 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:29.079 CC module/bdev/raid/bdev_raid.o 00:02:29.079 CC module/bdev/raid/bdev_raid_rpc.o 00:02:29.079 CC module/bdev/raid/raid0.o 00:02:29.079 CC module/bdev/raid/bdev_raid_sb.o 00:02:29.079 CC module/bdev/raid/raid1.o 00:02:29.079 CC module/bdev/raid/concat.o 00:02:29.079 CC module/blobfs/bdev/blobfs_bdev.o 00:02:29.079 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:29.336 LIB libspdk_accel_dpdk_compressdev.a 00:02:29.336 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:29.336 LIB libspdk_blobfs_bdev.a 00:02:29.336 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:29.336 SO libspdk_blobfs_bdev.so.6.0 00:02:29.337 LIB libspdk_bdev_error.a 00:02:29.595 LIB libspdk_bdev_split.a 00:02:29.595 SO libspdk_bdev_error.so.6.0 00:02:29.595 LIB libspdk_bdev_null.a 00:02:29.595 SYMLINK libspdk_blobfs_bdev.so 00:02:29.595 SO libspdk_bdev_split.so.6.0 00:02:29.595 LIB libspdk_bdev_ftl.a 00:02:29.595 LIB libspdk_bdev_gpt.a 00:02:29.595 SO libspdk_bdev_null.so.6.0 00:02:29.595 SYMLINK libspdk_bdev_error.so 00:02:29.595 SO libspdk_bdev_ftl.so.6.0 00:02:29.595 LIB libspdk_bdev_delay.a 00:02:29.595 LIB libspdk_bdev_passthru.a 00:02:29.595 LIB libspdk_bdev_aio.a 00:02:29.595 SO libspdk_bdev_gpt.so.6.0 00:02:29.595 LIB libspdk_bdev_zone_block.a 00:02:29.595 LIB libspdk_accel_dpdk_cryptodev.a 00:02:29.595 LIB libspdk_bdev_crypto.a 00:02:29.595 SYMLINK libspdk_bdev_split.so 00:02:29.595 SO libspdk_bdev_passthru.so.6.0 00:02:29.595 LIB libspdk_bdev_iscsi.a 00:02:29.595 LIB libspdk_bdev_malloc.a 00:02:29.595 SO libspdk_bdev_delay.so.6.0 00:02:29.595 LIB libspdk_bdev_compress.a 00:02:29.595 SO libspdk_bdev_aio.so.6.0 00:02:29.595 SYMLINK 
libspdk_bdev_null.so 00:02:29.595 SO libspdk_bdev_crypto.so.6.0 00:02:29.595 SO libspdk_bdev_zone_block.so.6.0 00:02:29.595 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:29.595 SYMLINK libspdk_bdev_ftl.so 00:02:29.595 SO libspdk_bdev_iscsi.so.6.0 00:02:29.595 SO libspdk_bdev_malloc.so.6.0 00:02:29.595 SO libspdk_bdev_compress.so.6.0 00:02:29.595 SYMLINK libspdk_bdev_gpt.so 00:02:29.595 SYMLINK libspdk_bdev_delay.so 00:02:29.595 SYMLINK libspdk_bdev_passthru.so 00:02:29.595 LIB libspdk_bdev_lvol.a 00:02:29.595 SYMLINK libspdk_bdev_aio.so 00:02:29.595 SYMLINK libspdk_bdev_zone_block.so 00:02:29.595 SYMLINK libspdk_bdev_crypto.so 00:02:29.595 SYMLINK libspdk_bdev_iscsi.so 00:02:29.595 SYMLINK libspdk_bdev_malloc.so 00:02:29.595 SO libspdk_bdev_lvol.so.6.0 00:02:29.595 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:29.595 SYMLINK libspdk_bdev_compress.so 00:02:29.595 LIB libspdk_bdev_virtio.a 00:02:29.854 SYMLINK libspdk_bdev_lvol.so 00:02:29.854 SO libspdk_bdev_virtio.so.6.0 00:02:29.854 SYMLINK libspdk_bdev_virtio.so 00:02:30.112 LIB libspdk_bdev_raid.a 00:02:30.112 SO libspdk_bdev_raid.so.6.0 00:02:30.112 SYMLINK libspdk_bdev_raid.so 00:02:30.680 LIB libspdk_bdev_nvme.a 00:02:30.939 SO libspdk_bdev_nvme.so.7.0 00:02:30.939 SYMLINK libspdk_bdev_nvme.so 00:02:31.506 CC module/event/subsystems/vmd/vmd.o 00:02:31.506 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:31.506 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:31.506 CC module/event/subsystems/keyring/keyring.o 00:02:31.506 CC module/event/subsystems/sock/sock.o 00:02:31.506 CC module/event/subsystems/scheduler/scheduler.o 00:02:31.506 CC module/event/subsystems/iobuf/iobuf.o 00:02:31.506 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:31.765 LIB libspdk_event_vhost_blk.a 00:02:31.765 LIB libspdk_event_vmd.a 00:02:31.765 LIB libspdk_event_keyring.a 00:02:31.765 LIB libspdk_event_sock.a 00:02:31.765 SO libspdk_event_vhost_blk.so.3.0 00:02:31.765 LIB libspdk_event_iobuf.a 00:02:31.765 LIB 
libspdk_event_scheduler.a 00:02:31.765 SO libspdk_event_keyring.so.1.0 00:02:31.765 SO libspdk_event_vmd.so.6.0 00:02:31.765 SO libspdk_event_iobuf.so.3.0 00:02:31.765 SO libspdk_event_sock.so.5.0 00:02:31.765 SO libspdk_event_scheduler.so.4.0 00:02:31.765 SYMLINK libspdk_event_vhost_blk.so 00:02:31.765 SYMLINK libspdk_event_keyring.so 00:02:31.765 SYMLINK libspdk_event_vmd.so 00:02:31.765 SYMLINK libspdk_event_sock.so 00:02:31.765 SYMLINK libspdk_event_iobuf.so 00:02:31.765 SYMLINK libspdk_event_scheduler.so 00:02:32.024 CC module/event/subsystems/accel/accel.o 00:02:32.283 LIB libspdk_event_accel.a 00:02:32.283 SO libspdk_event_accel.so.6.0 00:02:32.283 SYMLINK libspdk_event_accel.so 00:02:32.542 CC module/event/subsystems/bdev/bdev.o 00:02:32.801 LIB libspdk_event_bdev.a 00:02:32.801 SO libspdk_event_bdev.so.6.0 00:02:32.801 SYMLINK libspdk_event_bdev.so 00:02:33.060 CC module/event/subsystems/nbd/nbd.o 00:02:33.060 CC module/event/subsystems/scsi/scsi.o 00:02:33.060 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:33.060 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:33.060 CC module/event/subsystems/ublk/ublk.o 00:02:33.320 LIB libspdk_event_nbd.a 00:02:33.320 SO libspdk_event_nbd.so.6.0 00:02:33.320 LIB libspdk_event_scsi.a 00:02:33.320 LIB libspdk_event_ublk.a 00:02:33.320 SO libspdk_event_scsi.so.6.0 00:02:33.320 SYMLINK libspdk_event_nbd.so 00:02:33.320 SO libspdk_event_ublk.so.3.0 00:02:33.320 LIB libspdk_event_nvmf.a 00:02:33.320 SYMLINK libspdk_event_scsi.so 00:02:33.320 SO libspdk_event_nvmf.so.6.0 00:02:33.320 SYMLINK libspdk_event_ublk.so 00:02:33.320 SYMLINK libspdk_event_nvmf.so 00:02:33.579 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:33.579 CC module/event/subsystems/iscsi/iscsi.o 00:02:33.838 LIB libspdk_event_vhost_scsi.a 00:02:33.838 SO libspdk_event_vhost_scsi.so.3.0 00:02:33.838 LIB libspdk_event_iscsi.a 00:02:33.838 SO libspdk_event_iscsi.so.6.0 00:02:33.838 SYMLINK libspdk_event_vhost_scsi.so 00:02:33.838 SYMLINK 
libspdk_event_iscsi.so 00:02:34.098 SO libspdk.so.6.0 00:02:34.098 SYMLINK libspdk.so 00:02:34.369 CC app/spdk_nvme_perf/perf.o 00:02:34.369 CC app/trace_record/trace_record.o 00:02:34.369 CXX app/trace/trace.o 00:02:34.369 CC app/spdk_nvme_discover/discovery_aer.o 00:02:34.369 CC app/spdk_lspci/spdk_lspci.o 00:02:34.369 CC test/rpc_client/rpc_client_test.o 00:02:34.369 CC app/spdk_top/spdk_top.o 00:02:34.369 CC app/spdk_nvme_identify/identify.o 00:02:34.369 TEST_HEADER include/spdk/accel_module.h 00:02:34.369 TEST_HEADER include/spdk/accel.h 00:02:34.369 TEST_HEADER include/spdk/barrier.h 00:02:34.369 TEST_HEADER include/spdk/bdev.h 00:02:34.369 TEST_HEADER include/spdk/assert.h 00:02:34.369 TEST_HEADER include/spdk/base64.h 00:02:34.369 TEST_HEADER include/spdk/bdev_module.h 00:02:34.369 TEST_HEADER include/spdk/bit_array.h 00:02:34.370 TEST_HEADER include/spdk/bit_pool.h 00:02:34.370 TEST_HEADER include/spdk/bdev_zone.h 00:02:34.370 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:34.370 TEST_HEADER include/spdk/blobfs.h 00:02:34.370 TEST_HEADER include/spdk/blob_bdev.h 00:02:34.370 TEST_HEADER include/spdk/conf.h 00:02:34.370 TEST_HEADER include/spdk/blob.h 00:02:34.370 TEST_HEADER include/spdk/cpuset.h 00:02:34.370 TEST_HEADER include/spdk/config.h 00:02:34.370 TEST_HEADER include/spdk/crc64.h 00:02:34.370 TEST_HEADER include/spdk/crc16.h 00:02:34.370 TEST_HEADER include/spdk/crc32.h 00:02:34.370 TEST_HEADER include/spdk/dif.h 00:02:34.370 TEST_HEADER include/spdk/endian.h 00:02:34.370 TEST_HEADER include/spdk/event.h 00:02:34.370 TEST_HEADER include/spdk/dma.h 00:02:34.370 TEST_HEADER include/spdk/env_dpdk.h 00:02:34.370 TEST_HEADER include/spdk/env.h 00:02:34.370 TEST_HEADER include/spdk/fd_group.h 00:02:34.370 TEST_HEADER include/spdk/ftl.h 00:02:34.370 TEST_HEADER include/spdk/fd.h 00:02:34.370 TEST_HEADER include/spdk/file.h 00:02:34.370 TEST_HEADER include/spdk/gpt_spec.h 00:02:34.370 TEST_HEADER include/spdk/hexlify.h 00:02:34.370 TEST_HEADER 
include/spdk/histogram_data.h 00:02:34.370 TEST_HEADER include/spdk/idxd_spec.h 00:02:34.370 TEST_HEADER include/spdk/idxd.h 00:02:34.370 TEST_HEADER include/spdk/init.h 00:02:34.370 TEST_HEADER include/spdk/ioat_spec.h 00:02:34.370 TEST_HEADER include/spdk/ioat.h 00:02:34.370 TEST_HEADER include/spdk/iscsi_spec.h 00:02:34.370 TEST_HEADER include/spdk/json.h 00:02:34.370 TEST_HEADER include/spdk/keyring.h 00:02:34.370 TEST_HEADER include/spdk/likely.h 00:02:34.370 TEST_HEADER include/spdk/jsonrpc.h 00:02:34.370 TEST_HEADER include/spdk/lvol.h 00:02:34.370 TEST_HEADER include/spdk/keyring_module.h 00:02:34.370 CC app/spdk_dd/spdk_dd.o 00:02:34.370 TEST_HEADER include/spdk/log.h 00:02:34.370 TEST_HEADER include/spdk/memory.h 00:02:34.370 TEST_HEADER include/spdk/mmio.h 00:02:34.370 TEST_HEADER include/spdk/nbd.h 00:02:34.370 TEST_HEADER include/spdk/net.h 00:02:34.370 TEST_HEADER include/spdk/nvme.h 00:02:34.370 TEST_HEADER include/spdk/nvme_intel.h 00:02:34.370 TEST_HEADER include/spdk/notify.h 00:02:34.370 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:34.370 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:34.370 TEST_HEADER include/spdk/nvme_zns.h 00:02:34.370 CC app/iscsi_tgt/iscsi_tgt.o 00:02:34.370 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:34.370 TEST_HEADER include/spdk/nvme_spec.h 00:02:34.370 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:34.370 TEST_HEADER include/spdk/nvmf_transport.h 00:02:34.370 TEST_HEADER include/spdk/nvmf.h 00:02:34.370 TEST_HEADER include/spdk/nvmf_spec.h 00:02:34.370 TEST_HEADER include/spdk/opal_spec.h 00:02:34.370 CC app/nvmf_tgt/nvmf_main.o 00:02:34.370 TEST_HEADER include/spdk/opal.h 00:02:34.370 TEST_HEADER include/spdk/pci_ids.h 00:02:34.370 TEST_HEADER include/spdk/pipe.h 00:02:34.370 TEST_HEADER include/spdk/reduce.h 00:02:34.370 TEST_HEADER include/spdk/queue.h 00:02:34.370 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:34.370 TEST_HEADER include/spdk/scheduler.h 00:02:34.370 TEST_HEADER include/spdk/rpc.h 00:02:34.370 
TEST_HEADER include/spdk/scsi.h 00:02:34.370 TEST_HEADER include/spdk/scsi_spec.h 00:02:34.370 TEST_HEADER include/spdk/sock.h 00:02:34.370 TEST_HEADER include/spdk/stdinc.h 00:02:34.370 TEST_HEADER include/spdk/string.h 00:02:34.370 TEST_HEADER include/spdk/thread.h 00:02:34.370 TEST_HEADER include/spdk/trace_parser.h 00:02:34.370 TEST_HEADER include/spdk/tree.h 00:02:34.370 TEST_HEADER include/spdk/trace.h 00:02:34.370 TEST_HEADER include/spdk/ublk.h 00:02:34.370 TEST_HEADER include/spdk/util.h 00:02:34.370 TEST_HEADER include/spdk/version.h 00:02:34.370 TEST_HEADER include/spdk/uuid.h 00:02:34.370 TEST_HEADER include/spdk/vhost.h 00:02:34.370 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:34.370 TEST_HEADER include/spdk/vmd.h 00:02:34.370 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:34.370 TEST_HEADER include/spdk/xor.h 00:02:34.370 CXX test/cpp_headers/accel.o 00:02:34.370 TEST_HEADER include/spdk/zipf.h 00:02:34.370 CXX test/cpp_headers/accel_module.o 00:02:34.370 CXX test/cpp_headers/assert.o 00:02:34.370 CXX test/cpp_headers/barrier.o 00:02:34.370 CXX test/cpp_headers/bdev.o 00:02:34.370 CC app/spdk_tgt/spdk_tgt.o 00:02:34.370 CXX test/cpp_headers/base64.o 00:02:34.370 CXX test/cpp_headers/bdev_zone.o 00:02:34.370 CXX test/cpp_headers/bit_array.o 00:02:34.370 CXX test/cpp_headers/bdev_module.o 00:02:34.370 CXX test/cpp_headers/bit_pool.o 00:02:34.370 CXX test/cpp_headers/blob_bdev.o 00:02:34.370 CXX test/cpp_headers/blobfs.o 00:02:34.370 CXX test/cpp_headers/blobfs_bdev.o 00:02:34.370 CXX test/cpp_headers/blob.o 00:02:34.370 CXX test/cpp_headers/conf.o 00:02:34.370 CXX test/cpp_headers/cpuset.o 00:02:34.370 CXX test/cpp_headers/config.o 00:02:34.370 CXX test/cpp_headers/crc32.o 00:02:34.370 CXX test/cpp_headers/crc16.o 00:02:34.370 CXX test/cpp_headers/crc64.o 00:02:34.370 CXX test/cpp_headers/dif.o 00:02:34.370 CXX test/cpp_headers/env_dpdk.o 00:02:34.370 CXX test/cpp_headers/endian.o 00:02:34.370 CXX test/cpp_headers/dma.o 00:02:34.370 CXX 
test/cpp_headers/event.o 00:02:34.370 CXX test/cpp_headers/fd_group.o 00:02:34.370 CXX test/cpp_headers/env.o 00:02:34.370 CXX test/cpp_headers/file.o 00:02:34.370 CXX test/cpp_headers/fd.o 00:02:34.370 CXX test/cpp_headers/gpt_spec.o 00:02:34.370 CXX test/cpp_headers/ftl.o 00:02:34.370 CXX test/cpp_headers/hexlify.o 00:02:34.370 CXX test/cpp_headers/histogram_data.o 00:02:34.370 CXX test/cpp_headers/idxd.o 00:02:34.370 CXX test/cpp_headers/idxd_spec.o 00:02:34.370 CXX test/cpp_headers/init.o 00:02:34.370 CXX test/cpp_headers/ioat.o 00:02:34.370 CXX test/cpp_headers/ioat_spec.o 00:02:34.370 CXX test/cpp_headers/iscsi_spec.o 00:02:34.370 CXX test/cpp_headers/json.o 00:02:34.370 CXX test/cpp_headers/jsonrpc.o 00:02:34.370 CXX test/cpp_headers/keyring.o 00:02:34.370 CXX test/cpp_headers/keyring_module.o 00:02:34.370 CXX test/cpp_headers/likely.o 00:02:34.370 CXX test/cpp_headers/log.o 00:02:34.370 CXX test/cpp_headers/memory.o 00:02:34.370 CXX test/cpp_headers/lvol.o 00:02:34.370 CXX test/cpp_headers/mmio.o 00:02:34.370 CXX test/cpp_headers/nbd.o 00:02:34.370 CXX test/cpp_headers/notify.o 00:02:34.370 CXX test/cpp_headers/nvme.o 00:02:34.370 CXX test/cpp_headers/net.o 00:02:34.370 CXX test/cpp_headers/nvme_intel.o 00:02:34.370 CXX test/cpp_headers/nvme_ocssd.o 00:02:34.370 CXX test/cpp_headers/nvme_spec.o 00:02:34.370 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:34.370 CXX test/cpp_headers/nvme_zns.o 00:02:34.370 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:34.370 CXX test/cpp_headers/nvmf_cmd.o 00:02:34.370 CXX test/cpp_headers/nvmf.o 00:02:34.370 CXX test/cpp_headers/nvmf_spec.o 00:02:34.370 CXX test/cpp_headers/opal.o 00:02:34.370 CXX test/cpp_headers/nvmf_transport.o 00:02:34.370 CXX test/cpp_headers/opal_spec.o 00:02:34.370 CXX test/cpp_headers/pci_ids.o 00:02:34.370 CXX test/cpp_headers/pipe.o 00:02:34.370 CXX test/cpp_headers/queue.o 00:02:34.645 CC examples/util/zipf/zipf.o 00:02:34.645 CC app/fio/nvme/fio_plugin.o 00:02:34.645 CC examples/ioat/perf/perf.o 
00:02:34.645 CC test/thread/poller_perf/poller_perf.o 00:02:34.645 CC test/app/stub/stub.o 00:02:34.645 CC app/fio/bdev/fio_plugin.o 00:02:34.645 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:34.645 CC test/env/memory/memory_ut.o 00:02:34.645 CC test/env/vtophys/vtophys.o 00:02:34.645 CC examples/ioat/verify/verify.o 00:02:34.645 CC test/env/pci/pci_ut.o 00:02:34.645 CC test/app/histogram_perf/histogram_perf.o 00:02:34.645 CC test/app/bdev_svc/bdev_svc.o 00:02:34.645 CC test/app/jsoncat/jsoncat.o 00:02:34.645 CC test/dma/test_dma/test_dma.o 00:02:34.645 LINK spdk_lspci 00:02:34.912 LINK rpc_client_test 00:02:34.912 LINK spdk_nvme_discover 00:02:34.912 LINK spdk_trace_record 00:02:34.912 LINK iscsi_tgt 00:02:34.912 LINK nvmf_tgt 00:02:34.912 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:34.912 CC test/env/mem_callbacks/mem_callbacks.o 00:02:34.912 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:34.912 LINK interrupt_tgt 00:02:34.912 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:34.912 LINK spdk_tgt 00:02:35.174 LINK poller_perf 00:02:35.174 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:35.174 CXX test/cpp_headers/reduce.o 00:02:35.174 CXX test/cpp_headers/rpc.o 00:02:35.174 CXX test/cpp_headers/scheduler.o 00:02:35.174 CXX test/cpp_headers/scsi.o 00:02:35.174 LINK zipf 00:02:35.174 CXX test/cpp_headers/sock.o 00:02:35.174 CXX test/cpp_headers/scsi_spec.o 00:02:35.174 CXX test/cpp_headers/stdinc.o 00:02:35.174 CXX test/cpp_headers/string.o 00:02:35.174 CXX test/cpp_headers/thread.o 00:02:35.174 CXX test/cpp_headers/trace.o 00:02:35.174 LINK vtophys 00:02:35.174 LINK histogram_perf 00:02:35.174 CXX test/cpp_headers/trace_parser.o 00:02:35.174 CXX test/cpp_headers/tree.o 00:02:35.174 CXX test/cpp_headers/ublk.o 00:02:35.174 LINK stub 00:02:35.174 CXX test/cpp_headers/util.o 00:02:35.174 CXX test/cpp_headers/uuid.o 00:02:35.174 CXX test/cpp_headers/version.o 00:02:35.174 CXX test/cpp_headers/vfio_user_pci.o 00:02:35.174 LINK jsoncat 00:02:35.174 CXX 
test/cpp_headers/vfio_user_spec.o 00:02:35.174 CXX test/cpp_headers/vhost.o 00:02:35.174 CXX test/cpp_headers/vmd.o 00:02:35.174 LINK env_dpdk_post_init 00:02:35.174 CXX test/cpp_headers/zipf.o 00:02:35.174 CXX test/cpp_headers/xor.o 00:02:35.174 LINK bdev_svc 00:02:35.174 LINK ioat_perf 00:02:35.174 LINK spdk_dd 00:02:35.174 LINK verify 00:02:35.434 LINK pci_ut 00:02:35.434 LINK spdk_trace 00:02:35.434 LINK test_dma 00:02:35.434 LINK spdk_bdev 00:02:35.434 LINK spdk_nvme 00:02:35.434 LINK spdk_nvme_identify 00:02:35.434 LINK nvme_fuzz 00:02:35.434 CC test/event/reactor/reactor.o 00:02:35.434 CC test/event/reactor_perf/reactor_perf.o 00:02:35.434 CC test/event/event_perf/event_perf.o 00:02:35.750 CC test/event/app_repeat/app_repeat.o 00:02:35.750 LINK spdk_nvme_perf 00:02:35.750 CC test/event/scheduler/scheduler.o 00:02:35.750 LINK vhost_fuzz 00:02:35.750 LINK mem_callbacks 00:02:35.750 LINK reactor 00:02:35.750 CC examples/vmd/lsvmd/lsvmd.o 00:02:35.750 CC examples/idxd/perf/perf.o 00:02:35.750 CC examples/vmd/led/led.o 00:02:35.750 LINK spdk_top 00:02:35.750 LINK reactor_perf 00:02:35.750 CC examples/sock/hello_world/hello_sock.o 00:02:35.750 LINK event_perf 00:02:35.750 CC app/vhost/vhost.o 00:02:35.750 CC examples/thread/thread/thread_ex.o 00:02:35.750 LINK app_repeat 00:02:36.025 LINK scheduler 00:02:36.025 LINK lsvmd 00:02:36.025 LINK led 00:02:36.025 CC test/nvme/reset/reset.o 00:02:36.025 CC test/nvme/aer/aer.o 00:02:36.025 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:36.025 CC test/nvme/simple_copy/simple_copy.o 00:02:36.025 CC test/nvme/overhead/overhead.o 00:02:36.025 CC test/nvme/e2edp/nvme_dp.o 00:02:36.025 CC test/nvme/sgl/sgl.o 00:02:36.025 CC test/nvme/cuse/cuse.o 00:02:36.025 CC test/nvme/fused_ordering/fused_ordering.o 00:02:36.025 CC test/nvme/fdp/fdp.o 00:02:36.025 CC test/nvme/reserve/reserve.o 00:02:36.025 CC test/nvme/boot_partition/boot_partition.o 00:02:36.025 CC test/nvme/compliance/nvme_compliance.o 00:02:36.025 CC 
test/nvme/startup/startup.o 00:02:36.025 CC test/nvme/err_injection/err_injection.o 00:02:36.025 CC test/nvme/connect_stress/connect_stress.o 00:02:36.025 LINK vhost 00:02:36.025 LINK memory_ut 00:02:36.025 LINK hello_sock 00:02:36.025 CC test/accel/dif/dif.o 00:02:36.025 CC test/blobfs/mkfs/mkfs.o 00:02:36.025 LINK thread 00:02:36.025 LINK idxd_perf 00:02:36.025 CC test/lvol/esnap/esnap.o 00:02:36.025 LINK boot_partition 00:02:36.025 LINK err_injection 00:02:36.025 LINK connect_stress 00:02:36.025 LINK doorbell_aers 00:02:36.025 LINK startup 00:02:36.025 LINK reserve 00:02:36.025 LINK fused_ordering 00:02:36.025 LINK simple_copy 00:02:36.284 LINK reset 00:02:36.284 LINK nvme_dp 00:02:36.284 LINK aer 00:02:36.284 LINK sgl 00:02:36.284 LINK overhead 00:02:36.284 LINK nvme_compliance 00:02:36.284 LINK mkfs 00:02:36.284 LINK fdp 00:02:36.284 LINK dif 00:02:36.284 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:36.284 CC examples/nvme/hotplug/hotplug.o 00:02:36.284 CC examples/nvme/abort/abort.o 00:02:36.284 CC examples/nvme/arbitration/arbitration.o 00:02:36.284 CC examples/nvme/reconnect/reconnect.o 00:02:36.284 CC examples/nvme/hello_world/hello_world.o 00:02:36.284 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:36.284 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:36.541 LINK iscsi_fuzz 00:02:36.541 CC examples/accel/perf/accel_perf.o 00:02:36.541 CC examples/blob/cli/blobcli.o 00:02:36.541 CC examples/blob/hello_world/hello_blob.o 00:02:36.541 LINK hello_world 00:02:36.541 LINK cmb_copy 00:02:36.541 LINK pmr_persistence 00:02:36.541 LINK hotplug 00:02:36.800 LINK arbitration 00:02:36.800 LINK reconnect 00:02:36.800 LINK abort 00:02:36.800 LINK nvme_manage 00:02:36.800 LINK hello_blob 00:02:36.800 CC test/bdev/bdevio/bdevio.o 00:02:36.800 LINK accel_perf 00:02:36.800 LINK cuse 00:02:37.059 LINK blobcli 00:02:37.317 LINK bdevio 00:02:37.317 CC examples/bdev/bdevperf/bdevperf.o 00:02:37.317 CC examples/bdev/hello_world/hello_bdev.o 00:02:37.575 LINK hello_bdev 
00:02:37.834 LINK bdevperf 00:02:38.401 CC examples/nvmf/nvmf/nvmf.o 00:02:38.660 LINK nvmf 00:02:39.596 LINK esnap 00:02:39.855 00:02:39.855 real 1m8.391s 00:02:39.855 user 14m20.931s 00:02:39.855 sys 4m3.323s 00:02:39.855 23:24:24 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:39.855 23:24:24 make -- common/autotest_common.sh@10 -- $ set +x 00:02:39.855 ************************************ 00:02:39.855 END TEST make 00:02:39.855 ************************************ 00:02:39.855 23:24:24 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:39.855 23:24:24 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:39.855 23:24:24 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:39.855 23:24:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.855 23:24:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:39.855 23:24:24 -- pm/common@44 -- $ pid=80585 00:02:39.855 23:24:24 -- pm/common@50 -- $ kill -TERM 80585 00:02:39.855 23:24:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.855 23:24:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:39.855 23:24:24 -- pm/common@44 -- $ pid=80587 00:02:39.855 23:24:24 -- pm/common@50 -- $ kill -TERM 80587 00:02:39.855 23:24:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.855 23:24:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:39.855 23:24:24 -- pm/common@44 -- $ pid=80588 00:02:39.855 23:24:24 -- pm/common@50 -- $ kill -TERM 80588 00:02:39.855 23:24:24 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.855 23:24:24 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:39.855 23:24:24 -- pm/common@44 -- $ pid=80612 00:02:39.855 
23:24:24 -- pm/common@50 -- $ sudo -E kill -TERM 80612 00:02:39.855 23:24:24 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:39.855 23:24:24 -- nvmf/common.sh@7 -- # uname -s 00:02:39.855 23:24:24 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:39.855 23:24:24 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:39.855 23:24:24 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:39.855 23:24:24 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:39.855 23:24:24 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:39.855 23:24:24 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:39.855 23:24:24 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:39.855 23:24:24 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:39.855 23:24:24 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:39.855 23:24:24 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:39.856 23:24:24 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:02:39.856 23:24:24 -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:02:39.856 23:24:24 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:39.856 23:24:24 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:39.856 23:24:24 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:39.856 23:24:24 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:39.856 23:24:24 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:39.856 23:24:24 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:39.856 23:24:24 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:39.856 23:24:24 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:39.856 23:24:24 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:39.856 23:24:24 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:39.856 23:24:24 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:39.856 23:24:24 -- paths/export.sh@5 -- # export PATH 00:02:39.856 23:24:24 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:39.856 23:24:24 -- nvmf/common.sh@47 -- # : 0 00:02:39.856 23:24:24 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:39.856 23:24:24 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:39.856 23:24:24 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:39.856 23:24:24 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:39.856 23:24:24 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:39.856 23:24:24 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:39.856 23:24:24 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:39.856 23:24:24 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:39.856 23:24:24 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:39.856 23:24:24 -- spdk/autotest.sh@32 -- # 
uname -s 00:02:39.856 23:24:24 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:39.856 23:24:24 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:39.856 23:24:24 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:39.856 23:24:24 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:39.856 23:24:24 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:39.856 23:24:24 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:39.856 23:24:24 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:39.856 23:24:24 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:39.856 23:24:24 -- spdk/autotest.sh@48 -- # udevadm_pid=147301 00:02:39.856 23:24:24 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:39.856 23:24:24 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:39.856 23:24:24 -- pm/common@17 -- # local monitor 00:02:39.856 23:24:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.856 23:24:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.856 23:24:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.856 23:24:24 -- pm/common@21 -- # date +%s 00:02:39.856 23:24:24 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:39.856 23:24:24 -- pm/common@21 -- # date +%s 00:02:39.856 23:24:24 -- pm/common@21 -- # date +%s 00:02:39.856 23:24:24 -- pm/common@25 -- # sleep 1 00:02:39.856 23:24:24 -- pm/common@21 -- # date +%s 00:02:39.856 23:24:24 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721856264 00:02:39.856 23:24:24 -- pm/common@21 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721856264 00:02:39.856 23:24:24 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721856264 00:02:39.856 23:24:24 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721856264 00:02:39.856 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721856264_collect-cpu-temp.pm.log 00:02:39.856 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721856264_collect-vmstat.pm.log 00:02:39.856 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721856264_collect-cpu-load.pm.log 00:02:40.115 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721856264_collect-bmc-pm.bmc.pm.log 00:02:41.053 23:24:25 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:41.053 23:24:25 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:41.053 23:24:25 -- common/autotest_common.sh@724 -- # xtrace_disable 00:02:41.053 23:24:25 -- common/autotest_common.sh@10 -- # set +x 00:02:41.053 23:24:25 -- spdk/autotest.sh@59 -- # create_test_list 00:02:41.053 23:24:25 -- common/autotest_common.sh@748 -- # xtrace_disable 00:02:41.053 23:24:25 -- common/autotest_common.sh@10 -- # set +x 00:02:41.053 23:24:25 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:41.053 23:24:25 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:02:41.053 23:24:25 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:41.053 23:24:25 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:41.053 23:24:25 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:41.053 23:24:25 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:41.053 23:24:25 -- common/autotest_common.sh@1455 -- # uname 00:02:41.053 23:24:25 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:41.053 23:24:25 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:41.053 23:24:25 -- common/autotest_common.sh@1475 -- # uname 00:02:41.053 23:24:25 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:41.053 23:24:25 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:41.053 23:24:25 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:41.053 23:24:25 -- spdk/autotest.sh@72 -- # hash lcov 00:02:41.053 23:24:25 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:41.053 23:24:25 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:41.053 --rc lcov_branch_coverage=1 00:02:41.053 --rc lcov_function_coverage=1 00:02:41.053 --rc genhtml_branch_coverage=1 00:02:41.053 --rc genhtml_function_coverage=1 00:02:41.053 --rc genhtml_legend=1 00:02:41.053 --rc geninfo_all_blocks=1 00:02:41.053 ' 00:02:41.053 23:24:25 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:41.053 --rc lcov_branch_coverage=1 00:02:41.053 --rc lcov_function_coverage=1 00:02:41.053 --rc genhtml_branch_coverage=1 00:02:41.053 --rc genhtml_function_coverage=1 00:02:41.053 --rc genhtml_legend=1 00:02:41.053 --rc geninfo_all_blocks=1 00:02:41.053 ' 00:02:41.053 23:24:25 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:41.053 --rc lcov_branch_coverage=1 00:02:41.053 --rc lcov_function_coverage=1 00:02:41.053 --rc genhtml_branch_coverage=1 00:02:41.053 --rc genhtml_function_coverage=1 00:02:41.053 --rc genhtml_legend=1 
00:02:41.053 --rc geninfo_all_blocks=1 00:02:41.053 --no-external' 00:02:41.053 23:24:25 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:41.053 --rc lcov_branch_coverage=1 00:02:41.053 --rc lcov_function_coverage=1 00:02:41.053 --rc genhtml_branch_coverage=1 00:02:41.053 --rc genhtml_function_coverage=1 00:02:41.053 --rc genhtml_legend=1 00:02:41.053 --rc geninfo_all_blocks=1 00:02:41.053 --no-external' 00:02:41.053 23:24:25 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:41.053 lcov: LCOV version 1.14 00:02:41.053 23:24:25 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any 
data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:41.991 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:41.991 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:42.251 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 
00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 
00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:42.251 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:42.251 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:42.251 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:42.252 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:42.252 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:42.252 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:42.252 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:42.252 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:42.252 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:42.252 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:42.252 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:42.252 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:42.512 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:42.512 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:42.512 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 
00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:42.512 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:42.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:42.513 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:02:42.513 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:42.513 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:52.493 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:52.493 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:02.467 23:24:46 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:02.467 23:24:46 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:02.467 23:24:46 -- common/autotest_common.sh@10 -- # set +x 00:03:02.467 23:24:46 -- spdk/autotest.sh@91 -- # rm -f 00:03:02.467 23:24:46 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:04.372 0000:5f:00.0 (1b96 2600): Already using the nvme driver 00:03:04.372 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:03:04.372 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:04.372 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:04.372 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:04.372 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:04.372 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:04.372 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:04.631 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:04.890 23:24:49 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:04.890 23:24:49 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:04.890 23:24:49 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:04.890 23:24:49 
-- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:04.890 23:24:49 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:04.890 23:24:49 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:04.890 23:24:49 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:04.890 23:24:49 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:04.890 23:24:49 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:03:04.890 23:24:49 -- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:03:04.890 23:24:49 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:04.890 23:24:49 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:03:04.890 23:24:49 -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:03:04.890 23:24:49 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:03:04.890 23:24:49 -- common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:04.890 23:24:49 -- spdk/autotest.sh@98 -- # (( 1 > 0 )) 00:03:04.890 23:24:49 -- spdk/autotest.sh@103 -- # export PCI_BLOCKED=0000:5f:00.0 00:03:04.890 23:24:49 -- spdk/autotest.sh@103 -- # PCI_BLOCKED=0000:5f:00.0 00:03:04.890 23:24:49 -- spdk/autotest.sh@104 -- # export PCI_ZONED=0000:5f:00.0 00:03:04.890 23:24:49 -- spdk/autotest.sh@104 -- # PCI_ZONED=0000:5f:00.0 00:03:04.890 23:24:49 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:04.890 23:24:49 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 
00:03:04.890 23:24:49 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:04.890 23:24:49 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:04.891 23:24:49 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:04.891 No valid GPT data, bailing 00:03:04.891 23:24:49 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:04.891 23:24:49 -- scripts/common.sh@391 -- # pt= 00:03:04.891 23:24:49 -- scripts/common.sh@392 -- # return 1 00:03:04.891 23:24:49 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:04.891 1+0 records in 00:03:04.891 1+0 records out 00:03:04.891 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00580147 s, 181 MB/s 00:03:04.891 23:24:49 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:04.891 23:24:49 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:04.891 23:24:49 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme1n1 00:03:04.891 23:24:49 -- scripts/common.sh@378 -- # local block=/dev/nvme1n1 pt 00:03:04.891 23:24:49 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:03:04.891 No valid GPT data, bailing 00:03:04.891 23:24:49 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:03:04.891 23:24:49 -- scripts/common.sh@391 -- # pt= 00:03:04.891 23:24:49 -- scripts/common.sh@392 -- # return 1 00:03:04.891 23:24:49 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:03:04.891 1+0 records in 00:03:04.891 1+0 records out 00:03:04.891 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00186124 s, 563 MB/s 00:03:04.891 23:24:49 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:04.891 23:24:49 -- spdk/autotest.sh@112 -- # [[ -z 0000:5f:00.0 ]] 00:03:04.891 23:24:49 -- spdk/autotest.sh@112 -- # continue 00:03:04.891 23:24:49 -- spdk/autotest.sh@118 -- # sync 00:03:04.891 23:24:49 -- spdk/autotest.sh@120 -- # 
xtrace_disable_per_cmd reap_spdk_processes 00:03:04.891 23:24:49 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:04.891 23:24:49 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:10.165 23:24:54 -- spdk/autotest.sh@124 -- # uname -s 00:03:10.165 23:24:54 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:10.165 23:24:54 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:10.165 23:24:54 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:10.165 23:24:54 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:10.165 23:24:54 -- common/autotest_common.sh@10 -- # set +x 00:03:10.165 ************************************ 00:03:10.165 START TEST setup.sh 00:03:10.165 ************************************ 00:03:10.165 23:24:54 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:10.165 * Looking for test storage... 00:03:10.165 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:10.165 23:24:54 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:10.165 23:24:54 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:10.165 23:24:54 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:10.165 23:24:54 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:10.165 23:24:54 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:10.165 23:24:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:10.165 ************************************ 00:03:10.165 START TEST acl 00:03:10.165 ************************************ 00:03:10.165 23:24:54 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:10.165 * Looking for test storage... 
00:03:10.165 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:10.165 23:24:54 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:10.165 23:24:54 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:10.165 23:24:54 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:03:10.166 23:24:54 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:03:10.166 23:24:54 setup.sh.acl -- 
common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:10.166 23:24:54 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:10.166 23:24:54 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:10.166 23:24:54 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:10.166 23:24:54 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:10.166 23:24:54 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:10.166 23:24:54 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:10.166 23:24:54 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:12.772 23:24:57 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:12.772 23:24:57 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:12.772 23:24:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:12.772 23:24:57 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:12.772 23:24:57 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:12.772 23:24:57 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:16.062 Hugepages 00:03:16.062 node hugesize free / total 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 
00:03:16.062 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:00:04.5 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.062 23:25:00 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:16.063 23:25:00 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:16.063 23:25:00 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:16.063 23:25:00 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:16.063 23:25:00 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:16.063 23:25:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5f:00.0 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@21 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- 
setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 
00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.063 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:16.322 23:25:01 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:16.322 23:25:01 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:16.322 23:25:01 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:16.322 23:25:01 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:16.322 ************************************ 00:03:16.322 START TEST denied 00:03:16.322 ************************************ 00:03:16.322 23:25:01 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # denied 00:03:16.322 23:25:01 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:16.322 23:25:01 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED='0000:5f:00.0 0000:5e:00.0' 00:03:16.322 23:25:01 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:16.322 23:25:01 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:16.322 23:25:01 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:19.613 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:19.613 
23:25:04 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:19.613 23:25:04 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:24.890 00:03:24.890 real 0m7.733s 00:03:24.890 user 0m2.661s 00:03:24.890 sys 0m4.403s 00:03:24.890 23:25:08 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:24.890 23:25:08 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:24.890 ************************************ 00:03:24.890 END TEST denied 00:03:24.890 ************************************ 00:03:24.890 23:25:08 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:24.890 23:25:08 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:24.890 23:25:08 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:24.890 23:25:08 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:24.890 ************************************ 00:03:24.890 START TEST allowed 00:03:24.890 ************************************ 00:03:24.890 23:25:08 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:03:24.890 23:25:08 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:24.890 23:25:08 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:24.890 23:25:08 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:24.890 23:25:08 
setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:03:24.890 23:25:08 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:28.181 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:28.181 23:25:12 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:28.181 23:25:12 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:28.181 23:25:12 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:28.181 23:25:12 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:28.181 23:25:12 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:31.470 00:03:31.470 real 0m7.271s 00:03:31.470 user 0m2.291s 00:03:31.470 sys 0m4.098s 00:03:31.470 23:25:16 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:31.471 23:25:16 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:31.471 ************************************ 00:03:31.471 END TEST allowed 00:03:31.471 ************************************ 00:03:31.471 00:03:31.471 real 0m21.880s 00:03:31.471 user 0m7.561s 00:03:31.471 sys 0m12.928s 00:03:31.471 23:25:16 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:31.471 23:25:16 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:31.471 ************************************ 00:03:31.471 END TEST acl 00:03:31.471 ************************************ 00:03:31.471 23:25:16 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:31.471 23:25:16 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:31.471 23:25:16 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:31.471 23:25:16 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:31.471 ************************************ 00:03:31.471 
START TEST hugepages 00:03:31.471 ************************************ 00:03:31.471 23:25:16 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:31.471 * Looking for test storage... 00:03:31.471 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 
'MemFree: 75503472 kB' 'MemAvailable: 78924332 kB' 'Buffers: 2696 kB' 'Cached: 9843252 kB' 'SwapCached: 0 kB' 'Active: 6831584 kB' 'Inactive: 3507864 kB' 'Active(anon): 6434260 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 496744 kB' 'Mapped: 186632 kB' 'Shmem: 5940760 kB' 'KReclaimable: 223800 kB' 'Slab: 676876 kB' 'SReclaimable: 223800 kB' 'SUnreclaim: 453076 kB' 'KernelStack: 19760 kB' 'PageTables: 8492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52952956 kB' 'Committed_AS: 7813520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221304 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.471 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- 
setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:31.472 23:25:16 setup.sh.hugepages -- 
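The hundreds of `[[ X == \H\u\g\e\p\a\g\e\s\i\z\e ]] ... continue` steps above are one loop in setup/common.sh: it reads /proc/meminfo record by record with `IFS=': '`, skips every key that is not the one requested, and echoes the value once `Hugepagesize` matches (the `echo 2048` at the end). A minimal sketch of that technique, with sample meminfo lines inlined so it does not depend on the host (the function name matches the trace; the data values are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the meminfo scan from the trace: split each "Key: value kB"
# line on IFS=': ', compare the key, and print the value on a match.
# Every non-matching key is one of the trace's many "continue" steps.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done <<'EOF'
MemTotal: 93323012 kB
HugePages_Total: 1024
Hugepagesize: 2048 kB
EOF
    return 1   # requested key not present
}

get_meminfo Hugepagesize   # prints 2048, matching "echo 2048" in the trace
```

Because `:` is a non-whitespace IFS character, it merges with the adjacent space into a single delimiter, so `var` gets the bare key and `val` the numeric field without further trimming.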
setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:31.472 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@40 
-- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:31.473 23:25:16 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:31.473 23:25:16 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:31.473 23:25:16 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:31.473 23:25:16 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:31.473 ************************************ 00:03:31.473 START TEST default_setup 00:03:31.473 ************************************ 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # default_setup 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # 
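The `clear_hp` calls traced above iterate every NUMA node and write 0 to each `hugepages-*/nr_hugepages` file under /sys/devices/system/node. A runnable sketch of that pattern, using a scratch directory in place of the real sysfs tree (the real paths require root; the node and page-size layout below mirrors the two-node, two-size run in the log):

```shell
#!/usr/bin/env bash
# Sketch of clear_hp from the trace: zero every per-node hugepage pool.
# A mktemp tree stands in for /sys/devices/system/node so the sketch
# runs unprivileged; on a real host the same loop targets sysfs.
sysfs=$(mktemp -d)
for node in 0 1; do
    for sz in 2048kB 1048576kB; do
        mkdir -p "$sysfs/node$node/hugepages/hugepages-$sz"
        echo 512 > "$sysfs/node$node/hugepages/hugepages-$sz/nr_hugepages"
    done
done

clear_hp() {
    local node hp
    for node in "$sysfs"/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # the repeated "echo 0" in the trace
        done
    done
}
clear_hp
cat "$sysfs/node0/hugepages/hugepages-2048kB/nr_hugepages"   # prints 0
```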
node_ids=('0') 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:31.473 23:25:16 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:34.764 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:34.765 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 
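The `get_test_nr_hugepages 2097152 0` steps traced above reduce to one division: the requested size in kB over the default hugepage size (the 2048 kB read earlier from meminfo) gives `nr_hugepages=1024`, which is then assigned to the user-supplied node list (node 0 here). A sketch of that arithmetic with the trace's values (variable names follow the trace; the layout is simplified):

```shell
#!/usr/bin/env bash
# Sketch of the get_test_nr_hugepages arithmetic from the trace:
# 2097152 kB requested / 2048 kB per page = 1024 hugepages,
# pinned to the nodes the caller listed.
size=2097152             # kB, the argument in the trace
default_hugepages=2048   # kB, the "echo 2048" result from the meminfo scan
nr_hugepages=$(( size / default_hugepages ))

declare -A nodes_test
user_nodes=(0)           # node_ids=('0') in the trace
for node in "${user_nodes[@]}"; do
    nodes_test[$node]=$nr_hugepages
done
echo "$nr_hugepages hugepages on node(s): ${!nodes_test[*]}"
```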
00:03:34.765 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:34.765 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:35.707 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:35.707 
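The `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` test at setup/hugepages.sh@96 in the trace gates the anonymous-hugepage accounting: the bracketed token in /sys/kernel/mm/transparent_hugepage/enabled marks the active THP mode, and only when that mode is not `[never]` does the harness go on to read AnonHugePages. A sketch of the check with the trace's value inlined:

```shell
#!/usr/bin/env bash
# Sketch of the THP gate from the trace: the bracketed word in the
# sysfs "enabled" file is the active mode; skip AnonHugePages counting
# only when the active mode is [never]. Value inlined from the log.
enabled='always [madvise] never'   # as printed in the trace
if [[ $enabled != *"[never]"* ]]; then
    echo "THP active: count AnonHugePages"
else
    echo "THP disabled: anon=0"
fi
```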
23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77661144 kB' 'MemAvailable: 81081776 kB' 'Buffers: 2696 kB' 'Cached: 9843372 kB' 'SwapCached: 0 kB' 'Active: 6850264 kB' 'Inactive: 3507864 kB' 'Active(anon): 6452940 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515452 kB' 'Mapped: 186984 kB' 'Shmem: 5940880 kB' 'KReclaimable: 223344 kB' 'Slab: 674948 kB' 'SReclaimable: 223344 kB' 'SUnreclaim: 451604 kB' 'KernelStack: 19632 kB' 'PageTables: 7572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221304 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 
'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
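The `mem=("${mem[@]#Node +([0-9]) }")` step at setup/common.sh@29 in the trace handles per-node meminfo files: /sys/devices/system/node/nodeN/meminfo prefixes every line with `Node N `, which an extglob prefix-strip removes so the same key/value scan works on both file layouts. A sketch with sample node lines inlined (values illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the per-node prefix strip from the trace: per-node meminfo
# lines carry a "Node N " prefix; with extglob, +([0-9]) matches the
# node number and ${var#pattern} drops the whole prefix.
shopt -s extglob
mapfile -t mem <<'EOF'
Node 0 MemTotal: 46661506 kB
Node 0 HugePages_Total: 512
EOF
mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix
printf '%s\n' "${mem[@]}"
```

After the strip, the array holds plain `Key: value` records, so the `IFS=': '` read loop seen throughout this log applies unchanged.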
read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.707 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 
23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.708 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # 
[[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77659960 kB' 'MemAvailable: 81080592 kB' 'Buffers: 2696 kB' 'Cached: 9843376 kB' 'SwapCached: 0 kB' 'Active: 6851856 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454532 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516532 kB' 'Mapped: 186936 kB' 'Shmem: 5940884 kB' 'KReclaimable: 223344 kB' 'Slab: 674948 kB' 'SReclaimable: 223344 kB' 'SUnreclaim: 451604 kB' 'KernelStack: 19744 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221336 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.709 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.710 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.711 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local 
mem_f mem 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77659308 kB' 'MemAvailable: 81079940 kB' 'Buffers: 2696 kB' 'Cached: 9843396 kB' 'SwapCached: 0 kB' 'Active: 6851420 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454096 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516512 kB' 'Mapped: 186852 kB' 'Shmem: 5940904 kB' 'KReclaimable: 223344 kB' 'Slab: 674884 kB' 'SReclaimable: 223344 kB' 'SUnreclaim: 451540 kB' 'KernelStack: 19712 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7831016 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 
'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.711 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:35.713 nr_hugepages=1024 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:35.713 resv_hugepages=0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:35.713 surplus_hugepages=0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:35.713 anon_hugepages=0 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup --
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77659256 kB' 'MemAvailable: 81079888 kB' 'Buffers: 2696 kB' 'Cached: 9843416 kB' 'SwapCached: 0 kB' 'Active: 6851488 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454164 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516620 kB' 'Mapped: 186860 kB' 'Shmem: 5940924 kB' 'KReclaimable: 223344 kB' 'Slab: 674884 kB' 'SReclaimable: 223344 kB' 'SUnreclaim: 451540 kB' 'KernelStack: 19632 
kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7831036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221400 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.713 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.715 23:25:20
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( 
no_nodes > 0 )) 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 26765520 kB' 'MemUsed: 5869108 kB' 'SwapCached: 0 kB' 'Active: 2175244 kB' 'Inactive: 100316 kB' 'Active(anon): 1982248 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059240 kB' 'Mapped: 121000 kB' 'AnonPages: 219608 kB' 'Shmem: 1765928 kB' 'KernelStack: 11192 kB' 'PageTables: 5308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 
kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96224 kB' 'Slab: 343472 kB' 'SReclaimable: 96224 kB' 'SUnreclaim: 247248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.715 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:35.716 node0=1024 expecting 1024 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:35.716 00:03:35.716 real 0m4.263s 00:03:35.716 user 0m1.352s 00:03:35.716 sys 0m2.166s 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:35.716 23:25:20 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:35.716 ************************************ 00:03:35.716 END TEST default_setup 00:03:35.716 ************************************ 00:03:35.716 23:25:20 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:35.716 23:25:20 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:35.716 23:25:20 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:35.716 23:25:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:35.976 ************************************ 00:03:35.976 START TEST per_node_1G_alloc 00:03:35.976 ************************************ 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:35.976 23:25:20 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:35.976 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:35.977 23:25:20 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:35.977 23:25:20 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:38.513 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:38.773 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:38.773 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:38.773 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:39.036 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:39.036 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:39.036 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:39.036 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:39.036 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:39.036 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:39.036 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77643312 kB' 'MemAvailable: 81063912 kB' 'Buffers: 2696 kB' 'Cached: 9843520 kB' 'SwapCached: 0 kB' 'Active: 6852156 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454832 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517156 kB' 'Mapped: 187000 kB' 'Shmem: 5941028 kB' 'KReclaimable: 223280 kB' 'Slab: 674596 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451316 kB' 'KernelStack: 19712 kB' 'PageTables: 8568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221384 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.037 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile 
-t mem 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77648412 kB' 'MemAvailable: 81069012 kB' 'Buffers: 2696 kB' 'Cached: 9843520 kB' 'SwapCached: 0 kB' 'Active: 6851812 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454488 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516788 kB' 'Mapped: 186840 kB' 'Shmem: 5941028 kB' 'KReclaimable: 223280 kB' 'Slab: 674556 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451276 kB' 'KernelStack: 19648 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.038 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.039 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:39.040 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.040 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77646672 kB' 'MemAvailable: 81067272 kB' 'Buffers: 2696 kB' 'Cached: 9843540 kB' 'SwapCached: 0 kB' 'Active: 6851288 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453964 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 
516180 kB' 'Mapped: 186840 kB' 'Shmem: 5941048 kB' 'KReclaimable: 223280 kB' 'Slab: 674556 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451276 kB' 'KernelStack: 19648 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221384 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.041 23:25:23 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.041 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.042 
23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS/read/continue trace repeats for CmaFree, Unaccepted, HugePages_Total and HugePages_Free ...]
00:03:39.042 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:39.043 nr_hugepages=1024
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:39.043 resv_hugepages=0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:39.043 surplus_hugepages=0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:39.043 anon_hugepages=0
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77646992 kB' 'MemAvailable: 81067592 kB' 'Buffers: 2696 kB' 'Cached: 9843560 kB' 'SwapCached: 0 kB' 'Active: 6851604 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454280 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516500 kB' 'Mapped: 186900 kB' 'Shmem: 5941068 kB' 'KReclaimable: 223280 kB' 'Slab: 674612 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451332 kB' 'KernelStack: 19648 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7829084 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB'
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:39.043 23:25:23 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS/read/continue trace repeats for each remaining /proc/meminfo field ...]
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.045 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:39.306 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:39.306 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27792268 kB' 'MemUsed: 4842360 kB' 'SwapCached: 0 kB' 'Active: 2175128 kB' 'Inactive: 100316 kB' 'Active(anon): 1982132 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059344 kB' 'Mapped: 121020 kB' 'AnonPages: 219304 kB' 'Shmem: 1766032 kB' 'KernelStack: 11160 kB' 'PageTables: 5248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96160 kB' 'Slab: 343356 kB' 'SReclaimable: 96160 kB' 'SUnreclaim: 247196 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:39.306 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.306 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS/read/continue trace repeats for each remaining node0 meminfo field ...]
00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 
23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:39.307 23:25:24 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:39.307 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49855156 kB' 'MemUsed: 10833228 kB' 'SwapCached: 0 kB' 'Active: 4676388 kB' 'Inactive: 3407548 kB' 'Active(anon): 4472060 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7786932 kB' 'Mapped: 65880 kB' 'AnonPages: 297072 kB' 'Shmem: 4175056 kB' 'KernelStack: 8472 kB' 'PageTables: 3068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127120 kB' 'Slab: 331256 kB' 'SReclaimable: 127120 kB' 'SUnreclaim: 204136 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:39.308 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [xtrace elided: the field loop skips the remaining node1 meminfo fields, MemFree through HugePages_Free, each as the same continue / IFS=': ' / read -r var val _ triple] 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:39.309 node0=512 expecting 512 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 --
# for node in "${!nodes_test[@]}" 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:39.309 node1=512 expecting 512 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:39.309 00:03:39.309 real 0m3.345s 00:03:39.309 user 0m1.405s 00:03:39.309 sys 0m1.999s 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:39.309 23:25:24 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:39.309 ************************************ 00:03:39.309 END TEST per_node_1G_alloc 00:03:39.309 ************************************ 00:03:39.309 23:25:24 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:39.309 23:25:24 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:39.309 23:25:24 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:39.309 23:25:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:39.309 ************************************ 00:03:39.309 START TEST even_2G_alloc 00:03:39.309 ************************************ 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- 
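The per_node_1G_alloc trace above is dominated by setup/common.sh's get_meminfo helper walking a meminfo file field by field. A minimal standalone sketch of that parsing pattern, assuming the mapfile/IFS structure shown in the trace (the MEMINFO_FILE override is our testing hook, not part of SPDK's script):

```bash
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern whose xtrace fills this log: read
# /proc/meminfo (or a per-node copy under /sys) into an array, strip the
# "Node N " prefix that per-node files carry, then split each line on ': '
# and print the value of the requested field. Every non-matching field
# shows up in the trace as one continue / IFS=': ' / read -r triple.
# MEMINFO_FILE is our testing hook, not part of SPDK's setup/common.sh.
shopt -s extglob
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}
    local -a mem
    local line var val _
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # Per-node meminfo prefixes every line with "Node N "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done
    echo 0   # field absent: report 0, as the trace's "echo 0" does
}
```

Called as `get_meminfo HugePages_Surp 1`, it reads /sys/devices/system/node/node1/meminfo, strips the "Node 1 " prefix, and prints 0 when the node has no surplus hugepages, matching the `echo 0` / `return 0` pairs in the trace.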
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 
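The even_2G_alloc prologue above turns the requested size (2097152 kB at the default 2048 kB hugepage size) into nr_hugepages=1024, then walks _no_nodes=2 backwards, storing 512 in nodes_test for each node. A sketch of that arithmetic (the function name is ours, not SPDK's; the real helpers are get_test_nr_hugepages and get_test_nr_hugepages_per_node in setup/hugepages.sh):

```bash
#!/usr/bin/env bash
# Sketch of the per-node split performed in the trace above: a 2097152 kB
# request divided by the 2048 kB hugepage size gives 1024 pages, spread
# evenly across the NUMA nodes (2 in this run, so 512 per node).
# Function name is ours for illustration.
split_hugepages_evenly() {
    local size_kb=$1 hugepage_kb=$2 no_nodes=$3
    local nr_hugepages=$((size_kb / hugepage_kb))
    local per_node=$((nr_hugepages / no_nodes))
    local -a nodes_test
    # Mirror the trace's backwards walk: nodes_test[_no_nodes - 1]=...
    while ((no_nodes > 0)); do
        nodes_test[no_nodes - 1]=$per_node
        ((no_nodes--))
    done
    echo "${nodes_test[@]}"
}
```

With HUGE_EVEN_ALLOC=yes, the test then expects each node to report exactly this count, which is what the later `node0=512 expecting 512` / `node1=512 expecting 512` checks verify.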
00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.309 23:25:24 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:41.854 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:41.854 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:41.854 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:41.854 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:41.854 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:41.854 
23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:41.854 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:41.854 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:41.854 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77684628 kB' 'MemAvailable: 81105228 kB' 'Buffers: 2696 kB' 'Cached: 9843676 kB' 'SwapCached: 0 kB' 'Active: 6851024 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453700 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515772 kB' 'Mapped: 185788 kB' 'Shmem: 5941184 kB' 'KReclaimable: 223280 kB' 'Slab: 674340 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451060 kB' 'KernelStack: 19616 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7823132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.855 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 
23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:41.856 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77684712 kB' 'MemAvailable: 81105312 kB' 'Buffers: 2696 kB' 'Cached: 9843676 kB' 'SwapCached: 0 kB' 'Active: 6850724 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453400 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
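The long run of `IFS=': ' read -r var val _` / `continue` lines above is `get_meminfo` scanning every `/proc/meminfo` key until it matches the requested one (here `AnonHugePages`, then `HugePages_Surp`). A hedged, self-contained sketch of that lookup pattern, fed a sample instead of the live `/proc/meminfo` (the function name is illustrative, not SPDK's):

```shell
#!/usr/bin/env bash
# Sketch: return the numeric value for one /proc/meminfo key.
# IFS=': ' splits "Key: value kB" into var=Key, val=value, _=kB.
get_meminfo_sketch() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}

sample='MemTotal: 93323012 kB
HugePages_Total: 1024
HugePages_Surp: 0
Hugepagesize: 2048 kB'

get_meminfo_sketch HugePages_Total <<< "$sample"   # -> 1024
```

Against the real file this would be `get_meminfo_sketch HugePages_Surp < /proc/meminfo`; the verbose trace exists because the test runs under `set -x`, so every non-matching key produces a `[[ ... ]]` / `continue` pair in the log.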
'Writeback: 0 kB' 'AnonPages: 515476 kB' 'Mapped: 185740 kB' 'Shmem: 5941184 kB' 'KReclaimable: 223280 kB' 'Slab: 674340 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451060 kB' 'KernelStack: 19616 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7823148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.856 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.857 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 
23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.858 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77684144 kB' 'MemAvailable: 81104744 kB' 'Buffers: 2696 kB' 'Cached: 9843692 kB' 'SwapCached: 0 kB' 'Active: 6851076 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453752 kB' 'Inactive(anon): 0 kB' 'Active(file): 
397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515812 kB' 'Mapped: 185740 kB' 'Shmem: 5941200 kB' 'KReclaimable: 223280 kB' 'Slab: 674340 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451060 kB' 'KernelStack: 19632 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7823172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:41.859 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.860 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.861 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:41.861 nr_hugepages=1024 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:41.861 resv_hugepages=0 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:41.861 surplus_hugepages=0 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:41.861 anon_hugepages=0 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77684504 kB' 'MemAvailable: 81105104 kB' 'Buffers: 2696 kB' 'Cached: 9843716 kB' 'SwapCached: 0 kB' 'Active: 6850764 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453440 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515452 kB' 'Mapped: 185740 kB' 'Shmem: 5941224 kB' 'KReclaimable: 223280 kB' 'Slab: 674388 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451108 kB' 'KernelStack: 19616 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7823192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221384 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 
23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.861 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:41.862 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27802572 kB' 'MemUsed: 4832056 kB' 'SwapCached: 0 kB' 'Active: 2174816 kB' 'Inactive: 100316 kB' 'Active(anon): 1981820 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059468 kB' 'Mapped: 120468 kB' 'AnonPages: 218804 kB' 'Shmem: 1766156 kB' 'KernelStack: 11176 kB' 'PageTables: 5272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96160 kB' 'Slab: 343044 kB' 'SReclaimable: 96160 kB' 'SUnreclaim: 246884 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.863 23:25:26 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.863 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical xtrace field checks for the remaining node0 meminfo fields (Inactive(anon) through HugePages_Free) elided ...]
00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- #
mem=("${mem[@]#Node +([0-9]) }") 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:41.864 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49881932 kB' 'MemUsed: 10806452 kB' 'SwapCached: 0 kB' 'Active: 4675952 kB' 'Inactive: 3407548 kB' 'Active(anon): 4471624 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7786968 kB' 'Mapped: 65272 kB' 'AnonPages: 296648 kB' 'Shmem: 4175092 kB' 'KernelStack: 8440 kB' 'PageTables: 2924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127120 kB' 'Slab: 331344 kB' 'SReclaimable: 127120 kB' 'SUnreclaim: 204224 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... identical xtrace field checks for each field of the node1 meminfo dump (MemTotal through HugePages_Free) elided ...]
00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:41.866 node0=512 expecting 512 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:41.866 node1=512 expecting 512 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:41.866 00:03:41.866 real 0m2.713s 00:03:41.866 user 0m0.979s 00:03:41.866 sys 0m1.569s 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:41.866 23:25:26 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:41.866 ************************************ 00:03:41.866 END TEST
even_2G_alloc 00:03:41.866 ************************************ 00:03:42.126 23:25:26 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:42.126 23:25:26 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:42.126 23:25:26 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:42.126 23:25:26 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:42.126 ************************************ 00:03:42.126 START TEST odd_alloc 00:03:42.126 ************************************ 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 
00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:42.126 23:25:26 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:44.728 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:44.728 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:44.728 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:00:04.3 (8086 2021): Already using the 
vfio-pci driver 00:03:44.728 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:44.728 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var 
val 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77691820 kB' 'MemAvailable: 81112420 kB' 'Buffers: 2696 kB' 'Cached: 9843820 kB' 'SwapCached: 0 kB' 'Active: 6851532 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454208 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516156 kB' 'Mapped: 185796 kB' 'Shmem: 5941328 kB' 'KReclaimable: 223280 kB' 'Slab: 674632 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451352 kB' 'KernelStack: 19616 kB' 'PageTables: 8192 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7823692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221384 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.728 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 
23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77693140 kB' 'MemAvailable: 81113740 kB' 'Buffers: 2696 kB' 'Cached: 9843820 kB' 'SwapCached: 0 kB' 'Active: 6851544 kB' 'Inactive: 3507864 kB' 'Active(anon): 6454220 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 
'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 516692 kB' 'Mapped: 186252 kB' 'Shmem: 5941328 kB' 'KReclaimable: 223280 kB' 'Slab: 674708 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451428 kB' 'KernelStack: 19584 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7825196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221336 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.729 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 
23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 
23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.730 23:25:29 setup.sh.hugepages.odd_alloc 
[trace trimmed: the setup/common.sh@31-32 read/continue loop repeats for each remaining /proc/meminfo key (KReclaimable through HugePages_Rsvd) without matching HugePages_Surp]
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc --
setup/common.sh@19 -- # local var val 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.731 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77693876 kB' 'MemAvailable: 81114476 kB' 'Buffers: 2696 kB' 'Cached: 9843840 kB' 'SwapCached: 0 kB' 'Active: 6856316 kB' 'Inactive: 3507864 kB' 'Active(anon): 6458992 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521408 kB' 'Mapped: 186252 kB' 'Shmem: 5941348 kB' 'KReclaimable: 223280 kB' 'Slab: 674708 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451428 kB' 'KernelStack: 19616 kB' 'PageTables: 8208 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7829848 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221340 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 
1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB'
[trace trimmed: the setup/common.sh@31-32 read/continue loop repeats for each /proc/meminfo key (MemTotal through HugePages_Free) without matching HugePages_Rsvd]
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
nr_hugepages=1025
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.733 23:25:29
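The trace above is the xtrace of a `get_meminfo` helper that reads `/proc/meminfo` into an array and loops over it with `IFS=': ' read -r var val _` until a key matches, echoing the value. The following is a minimal reconstruction inferred from the traced commands (setup/common.sh@17-33), not copied from the SPDK source, so details may differ:

```shell
#!/usr/bin/env bash
# Reconstruction of the get_meminfo helper whose xtrace appears above.
# Inferred from the traced commands, not the verbatim setup/common.sh.
shopt -s extglob  # needed for the "Node N " prefix-stripping pattern

get_meminfo() {
    local get=$1
    local node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    local mem line
    # Per-node queries read the node-specific meminfo file when it exists.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # Per-node meminfo lines carry a "Node N " prefix; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        # Split "Key:   value kB" on ':' and spaces, as in the trace.
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo HugePages_Total
```

Run on the traced host this would print `1025`; on any Linux machine it prints that host's current hugepage total.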
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.733 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77692768 kB' 'MemAvailable: 81113368 kB' 'Buffers: 2696 kB' 'Cached: 9843860 kB' 'SwapCached: 0 kB' 'Active: 6857032 kB' 'Inactive: 3507864 kB' 'Active(anon): 6459708 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521664 kB' 'Mapped: 186592 kB' 'Shmem: 5941368 kB' 'KReclaimable: 223280 kB' 'Slab: 674708 kB' 'SReclaimable: 223280 kB' 'SUnreclaim: 451428 kB' 'KernelStack: 19632 kB' 'PageTables: 8308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000508 kB' 'Committed_AS: 7829636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
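The trace above is the `get_meminfo` helper walking `/proc/meminfo` line by line: each `IFS=': ' read -r var val _` splits one `Key: value [kB]` record, non-matching keys hit `continue`, and the matching key's value is echoed back (here `HugePages_Total: 1025`). A minimal standalone sketch of that parsing pattern (not the SPDK script itself; the sample data below is copied from the fields shown in the trace):

```shell
#!/usr/bin/env bash
# Return the value for one /proc/meminfo-style "Key: value [kB]" field
# read from stdin. Mirrors the IFS=': ' read loop in the trace above.
get_field() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped, exactly like the "continue" runs
        # in the log; the unit suffix (kB) lands in "_" and is discarded.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# Sample input using values taken from the trace:
sample='MemTotal: 93323012 kB
HugePages_Total: 1025
HugePages_Rsvd: 0'

get_field HugePages_Total <<<"$sample"   # prints 1025
```

Splitting on `IFS=': '` rather than post-processing with `awk` keeps the loop a pure-bash single pass, which is why the trace shows one `read`/`[[ ... ]]`/`continue` triple per meminfo field.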
00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.734 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 
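At this point the trace switches from system-wide accounting to per-node accounting: `get_nodes` enumerates `/sys/devices/system/node/node[0-9]*` and records 512 pages on node 0 and 513 on node 1, which is the "odd_alloc" property under test (1025 pages cannot split evenly across 2 nodes). A hedged sketch of that arithmetic only (the real script reads the counts from sysfs rather than computing them; the numbers below are taken from the trace):

```shell
#!/usr/bin/env bash
# Illustrate the odd split the test verifies: 1025 hugepages over 2 NUMA
# nodes becomes 512 + 513, with the remainder landing on the last node.
nr_hugepages=1025
no_nodes=2

nodes_test=()
for ((node = 0; node < no_nodes; node++)); do
    # Even share per node: 1025 / 2 = 512.
    nodes_test[node]=$((nr_hugepages / no_nodes))
done
# Remainder (1025 % 2 = 1) goes to the last node, giving 513.
(( nodes_test[no_nodes - 1] += nr_hugepages % no_nodes ))

echo "${nodes_test[0]} ${nodes_test[1]}"   # prints "512 513"
```

The later `(( 1025 == nr_hugepages + surp + resv ))` checks in the trace confirm the same invariant from the other direction: the per-node totals read back from `/sys/devices/system/node/node0/meminfo` and `node1/meminfo` must sum to the requested odd count.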
00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27802100 kB' 'MemUsed: 4832528 kB' 'SwapCached: 0 kB' 'Active: 2174528 kB' 'Inactive: 100316 kB' 'Active(anon): 
1981532 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059560 kB' 'Mapped: 120484 kB' 'AnonPages: 218464 kB' 'Shmem: 1766248 kB' 'KernelStack: 11160 kB' 'PageTables: 5172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96160 kB' 'Slab: 343292 kB' 'SReclaimable: 96160 kB' 'SUnreclaim: 247132 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.735 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.736 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.737 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 
00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.737 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 49899120 kB' 'MemUsed: 10789264 kB' 'SwapCached: 0 kB' 'Active: 4676532 kB' 'Inactive: 3407548 kB' 'Active(anon): 4472204 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7786996 kB' 'Mapped: 65264 kB' 'AnonPages: 297184 kB' 'Shmem: 4175120 kB' 'KernelStack: 8392 kB' 'PageTables: 2788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127120 kB' 'Slab: 331416 kB' 'SReclaimable: 127120 kB' 'SUnreclaim: 204296 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.997 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 
23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.998 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:44.999 23:25:29 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:44.999 node0=512 expecting 513 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:44.999 node1=513 expecting 512 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:44.999 00:03:44.999 real 0m2.851s 00:03:44.999 user 0m1.103s 00:03:44.999 sys 0m1.730s 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:44.999 23:25:29 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:44.999 
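The long `IFS=': ' read -r var val _` / `continue` runs traced above come from `setup/common.sh`'s `get_meminfo` helper: it picks a per-node meminfo file when one exists, strips the `Node N ` prefix with an extglob expansion, then scans line by line for the requested field (here `HugePages_Surp`). A minimal sketch of that pattern, reconstructed only from the commands visible in the trace — the fallback logic and return values beyond what the trace shows are assumptions:

```shell
#!/usr/bin/env bash
# Sketch of the meminfo-parsing pattern seen in the xtrace above.
# extglob must be enabled for the "Node +([0-9]) " prefix strip.
shopt -s extglob

get_meminfo() {
    local get=$1 node=${2:-}   # field name, optional NUMA node
    local mem_f=/proc/meminfo var val _
    # Prefer the per-node meminfo file when a node is given (as in the trace).
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node lines look like "Node 1 HugePages_Surp: 0"; drop the prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        # Split "Field: value kB" on colon/space, exactly as the trace does.
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
        continue   # field mismatch: keep scanning (the repeated trace lines)
    done
    return 1
}
```

Each `[[ Field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]` / `continue` pair in the log is one iteration of that loop failing the match, which is why every meminfo field appears once before `echo 0` and `return 0`.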
************************************ 00:03:44.999 END TEST odd_alloc 00:03:44.999 ************************************ 00:03:44.999 23:25:29 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:44.999 23:25:29 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:44.999 23:25:29 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:44.999 23:25:29 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:44.999 ************************************ 00:03:44.999 START TEST custom_alloc 00:03:44.999 ************************************ 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # 
user_nodes=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@49 -- # local size=2097152 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:44.999 23:25:29 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:44.999 23:25:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:47.535 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:47.797 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:47.797 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:47.797 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:47.797 23:25:32 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf 
'%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76672296 kB' 'MemAvailable: 80092848 kB' 'Buffers: 2696 kB' 'Cached: 9843976 kB' 'SwapCached: 0 kB' 'Active: 6850516 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453192 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514880 kB' 'Mapped: 185812 kB' 'Shmem: 5941484 kB' 'KReclaimable: 223184 kB' 'Slab: 674912 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451728 kB' 'KernelStack: 19616 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7825492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221432 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:47.797 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue [... identical per-key trace repeated for the remaining /proc/meminfo fields elided ...] 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:47.798
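The trace above is `setup/common.sh`'s `get_meminfo` walking `/proc/meminfo` key by key until it hits `AnonHugePages` (0 kB on this node), which is why `hugepages.sh@97` sets `anon=0`. A self-contained sketch of that pattern (hypothetical names; the real script first snapshots the file with `mapfile -t mem` and iterates the array rather than reading the file directly):

```shell
#!/bin/sh
# Hypothetical sketch (not setup/common.sh itself) of the get_meminfo
# pattern the trace shows: read key/value pairs with IFS=': ',
# echo the first matching value, fall back to 0.
get_meminfo() {
    get=$1
    file=$2
    while IFS=': ' read -r var val _; do
        if [ "$var" = "$get" ]; then
            echo "$val"
            return 0
        fi
    done < "$file"
    echo 0
}

# Small stand-in for /proc/meminfo so the sketch is self-contained.
cat > /tmp/meminfo.sample <<'EOF'
MemTotal: 93323012 kB
AnonHugePages: 0 kB
HugePages_Surp: 0
EOF

get_meminfo AnonHugePages /tmp/meminfo.sample   # prints 0
get_meminfo MemTotal /tmp/meminfo.sample        # prints 93323012
```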
23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:47.798 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76672892 kB' 'MemAvailable: 80093444 kB' 'Buffers: 2696 kB' 'Cached: 9843976 kB' 'SwapCached: 0 kB' 'Active: 6850184 kB' 'Inactive: 3507864 kB' 'Active(anon): 6452860 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514528 kB' 'Mapped: 185768 kB' 'Shmem: 5941484 kB' 'KReclaimable: 223184 kB' 'Slab: 674880 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451696 kB' 'KernelStack: 19760 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7827000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221464 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.799 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue [... identical per-key trace repeated for the intervening /proc/meminfo fields elided ...] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:47.800 23:25:32 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:47.800 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:47.801 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76674448 kB' 'MemAvailable: 80095000 kB' 'Buffers: 2696 kB' 'Cached: 9844000 kB' 'SwapCached: 0 kB' 'Active: 6850436 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453112 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514788 kB' 'Mapped: 185768 kB' 'Shmem: 5941508 kB' 'KReclaimable: 223184 kB' 'Slab: 674948 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451764 kB' 'KernelStack: 19840 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7827020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221512 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:47.801 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [trace condensed: each /proc/meminfo field from MemTotal through HugePages_Free is tested against HugePages_Rsvd and skipped with 'continue'] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:48.065 nr_hugepages=1536 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:48.065 resv_hugepages=0 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:48.065 surplus_hugepages=0 00:03:48.065 23:25:32
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:48.065 anon_hugepages=0 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 76673812 kB' 'MemAvailable: 80094364 kB' 'Buffers: 2696 kB' 'Cached: 9844000 kB' 'SwapCached: 0 kB' 'Active: 6850248 kB' 'Inactive: 3507864 kB' 'Active(anon): 6452924 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 
'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514600 kB' 'Mapped: 185768 kB' 'Shmem: 5941508 kB' 'KReclaimable: 223184 kB' 'Slab: 674948 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451764 kB' 'KernelStack: 19808 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477244 kB' 'Committed_AS: 7827044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221528 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 
23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.065 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 
23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.066 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
[[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:48.067 23:25:32 
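The trace up to this point is one complete `get_meminfo HugePages_Total` pass: every `Key: value` line of `/proc/meminfo` is split with `IFS=': '` and `read -r var val _`, non-matching keys hit `continue`, and the matching value (`1536`) is echoed. A minimal standalone sketch of that pattern follows; the function and variable names here are illustrative, not the exact `setup/common.sh` implementation (which first captures the file via `mapfile` and then loops over the array).

```shell
#!/usr/bin/env bash
# Sketch of the field-matching loop seen in the trace: split each
# "Key: value [unit]" line on ': ' and echo the value for the requested key.
get_field() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped, mirroring the [[ ... ]] / continue
        # pairs in the trace (e.g. HugePages_Free vs HugePages_Total).
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}
```

Run against a meminfo snapshot like the one printed above, `get_field HugePages_Total` would yield `1536` and `get_field HugePages_Rsvd` would yield `0`.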
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 27830852 kB' 'MemUsed: 4803776 kB' 'SwapCached: 0 kB' 'Active: 2174300 kB' 'Inactive: 100316 kB' 'Active(anon): 1981304 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059676 kB' 
'Mapped: 120500 kB' 'AnonPages: 217608 kB' 'Shmem: 1766364 kB' 'KernelStack: 11288 kB' 'PageTables: 5160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96064 kB' 'Slab: 343528 kB' 'SReclaimable: 96064 kB' 'SUnreclaim: 247464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:48.067 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' read loop scans the remaining node0 meminfo fields (FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free); none match HugePages_Surp] 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- #
get_meminfo HugePages_Surp 1 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:48.068 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:48.069 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:48.069 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:48.069 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:48.069 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688384 kB' 'MemFree: 48842816 kB' 'MemUsed: 11845568 kB' 'SwapCached: 0 kB' 'Active: 4675740 kB' 'Inactive: 3407548 kB' 'Active(anon): 4471412 kB' 'Inactive(anon): 0 kB' 'Active(file): 204328 kB' 'Inactive(file): 3407548 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7787036 kB' 'Mapped: 65772 kB' 'AnonPages: 296756 kB' 'Shmem: 4175160 kB' 'KernelStack: 8504 kB' 'PageTables: 3008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127120 kB' 'Slab: 331420 kB' 'SReclaimable: 127120 kB' 'SUnreclaim: 204300 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:48.069 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' read loop scans the node1 meminfo fields (MemTotal through HugePages_Free); none match HugePages_Surp] 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:48.070 node0=512 expecting 512 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- #
sorted_t[nodes_test[node]]=1 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:48.070 node1=1024 expecting 1024 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:48.070 00:03:48.070 real 0m3.078s 00:03:48.070 user 0m1.196s 00:03:48.070 sys 0m1.887s 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:48.070 23:25:32 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:48.070 ************************************ 00:03:48.070 END TEST custom_alloc 00:03:48.070 ************************************ 00:03:48.070 23:25:32 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:48.070 23:25:32 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:48.070 23:25:32 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:48.070 23:25:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:48.070 ************************************ 00:03:48.070 START TEST no_shrink_alloc 00:03:48.070 ************************************ 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # 
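The custom_alloc trace above is dominated by one pattern: setup/common.sh splits each meminfo line with IFS=': ' and skips fields until the requested key matches. A minimal standalone sketch of that field-scan pattern (sample data piped in, hypothetical function name, not the actual setup/common.sh code):

```shell
# Hypothetical sketch of the get_meminfo field-scan seen in the xtrace:
# split each "Key: value" line on IFS=': ', skip until the key matches,
# then print the value. Reads stdin instead of live /proc files.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

printf '%s\n' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' |
    get_meminfo_sketch HugePages_Surp   # prints: 0
```

Each non-matching key in the sample corresponds to one `[[ ... ]] / continue` pair in the xtrace output, which is why a single get_meminfo call produces dozens of log lines.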
node_ids=('0') 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:48.070 23:25:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:50.603 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:50.862 0000:5e:00.0 (8086 
0a54): Already using the vfio-pci driver 00:03:50.862 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:50.862 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # 
[[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77699112 kB' 'MemAvailable: 81119664 kB' 'Buffers: 2696 kB' 'Cached: 9844132 kB' 'SwapCached: 0 kB' 'Active: 6850752 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453428 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514952 kB' 'Mapped: 185820 kB' 'Shmem: 5941640 kB' 'KReclaimable: 223184 kB' 'Slab: 674876 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451692 kB' 
'KernelStack: 19632 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221400 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: IFS=': ' read loop scans meminfo fields (MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive); none match AnonHugePages yet] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31
-- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.863 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.127 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 
00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77701128 kB' 'MemAvailable: 81121680 kB' 'Buffers: 2696 kB' 'Cached: 9844136 kB' 'SwapCached: 0 kB' 'Active: 6850760 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453436 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515000 kB' 'Mapped: 185772 kB' 'Shmem: 5941644 kB' 'KReclaimable: 223184 kB' 'Slab: 674876 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451692 kB' 'KernelStack: 19648 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 
7825212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221352 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.128 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:51.129 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77701036 kB' 'MemAvailable: 81121588 kB' 'Buffers: 2696 kB' 'Cached: 9844152 kB' 
'SwapCached: 0 kB' 'Active: 6850272 kB' 'Inactive: 3507864 kB' 'Active(anon): 6452948 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514508 kB' 'Mapped: 185772 kB' 'Shmem: 5941660 kB' 'KReclaimable: 223184 kB' 'Slab: 674896 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451712 kB' 'KernelStack: 19616 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.130 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.131 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:51.132 nr_hugepages=1024 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:51.132 resv_hugepages=0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:51.132 surplus_hugepages=0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:51.132 anon_hugepages=0 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77700996 kB' 'MemAvailable: 81121548 kB' 'Buffers: 2696 kB' 'Cached: 9844176 kB' 'SwapCached: 0 kB' 'Active: 6850284 kB' 'Inactive: 3507864 kB' 'Active(anon): 6452960 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514508 kB' 'Mapped: 185772 kB' 'Shmem: 5941684 kB' 'KReclaimable: 223184 kB' 'Slab: 674896 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451712 kB' 'KernelStack: 19616 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221368 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 
0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:51.132 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [repeated xtrace elided: each /proc/meminfo field from MemTotal through CmaFree is tested against HugePages_Total and skipped]
00:03:51.133 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in 
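The `get_meminfo` trace above shows the script scanning meminfo lines with `IFS=': '` and `read -r var val _` until the requested key matches. A minimal runnable sketch of that parsing pattern (an assumed simplification: input is piped in rather than read from /proc/meminfo):

```shell
#!/usr/bin/env bash
# Minimal sketch of the get_meminfo pattern traced above (assumption:
# simplified to read piped input instead of /proc/meminfo or a per-node
# sysfs meminfo file).
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # "HugePages_Total: 1024" splits into var=HugePages_Total, val=1024;
        # a trailing unit such as "kB" is swallowed by the throwaway field.
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

printf '%s\n' 'MemTotal: 93323012 kB' 'HugePages_Total: 1024' 'HugePages_Rsvd: 0' \
    | get_meminfo HugePages_Total   # prints: 1024
```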
"${!nodes_test[@]}" 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 26794680 kB' 'MemUsed: 5839948 kB' 'SwapCached: 0 kB' 'Active: 2174388 kB' 'Inactive: 100316 kB' 'Active(anon): 1981392 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059820 kB' 'Mapped: 120504 kB' 'AnonPages: 218048 kB' 'Shmem: 1766508 kB' 'KernelStack: 11208 kB' 'PageTables: 5324 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96064 kB' 'Slab: 343492 kB' 'SReclaimable: 96064 kB' 'SUnreclaim: 247428 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:51.134 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [repeated xtrace elided: each node0 meminfo field is tested against HugePages_Surp and skipped; section ends mid-scan]
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- 
# (( nodes_test[node] += 0 )) 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:51.135 node0=1024 expecting 1024 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.135 23:25:35 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:53.672 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:53.931 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:53.931 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.7 (8086 2021): Already 
using the vfio-pci driver 00:03:53.931 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:53.931 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:53.931 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:54.195 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77688456 kB' 'MemAvailable: 81109008 kB' 'Buffers: 2696 kB' 'Cached: 9844268 kB' 'SwapCached: 0 kB' 'Active: 6851296 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453972 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515460 kB' 'Mapped: 185828 kB' 'Shmem: 5941776 kB' 'KReclaimable: 223184 kB' 'Slab: 674680 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451496 kB' 'KernelStack: 19648 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221464 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 
2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.195 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 
23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.196 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 
23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.197 23:25:38 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77688184 kB' 'MemAvailable: 81108736 kB' 'Buffers: 2696 kB' 'Cached: 9844272 kB' 'SwapCached: 0 kB' 'Active: 6850716 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453392 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514828 kB' 'Mapped: 185780 kB' 'Shmem: 5941780 kB' 'KReclaimable: 223184 kB' 'Slab: 674680 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451496 kB' 'KernelStack: 19600 kB' 'PageTables: 8140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221416 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:54.197 
23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.197 23:25:38 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.197 [... identical setup/common.sh@31/@32 skip iterations for the remaining non-matching meminfo keys elided ...] 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:54.199 23:25:39
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77688184 kB' 'MemAvailable: 81108736 kB' 'Buffers: 2696 kB' 'Cached: 9844288 kB' 'SwapCached: 0 kB' 'Active: 6850892 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453568 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514992 kB' 'Mapped: 185780 kB' 'Shmem: 5941796 kB' 'KReclaimable: 223184 kB' 'Slab: 674760 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451576 kB' 'KernelStack: 19616 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221416 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB' 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.199 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.199 [... identical setup/common.sh@31/@32 skip iterations for the remaining non-matching meminfo keys elided ...]
continue 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.200 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # return 0
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:54.201 nr_hugepages=1024
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:54.201 resv_hugepages=0
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:54.201 surplus_hugepages=0
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:54.201 anon_hugepages=0
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }")
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93323012 kB' 'MemFree: 77688540 kB' 'MemAvailable: 81109092 kB' 'Buffers: 2696 kB' 'Cached: 9844288 kB' 'SwapCached: 0 kB' 'Active: 6850892 kB' 'Inactive: 3507864 kB' 'Active(anon): 6453568 kB' 'Inactive(anon): 0 kB' 'Active(file): 397324 kB' 'Inactive(file): 3507864 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514992 kB' 'Mapped: 185780 kB' 'Shmem: 5941796 kB' 'KReclaimable: 223184 kB' 'Slab: 674760 kB' 'SReclaimable: 223184 kB' 'SUnreclaim: 451576 kB' 'KernelStack: 19616 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001532 kB' 'Committed_AS: 7825940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221416 kB' 'VmallocChunk: 0 kB' 'Percpu: 63360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2130900 kB' 'DirectMap2M: 13277184 kB' 'DirectMap1G: 87031808 kB'
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:54.201 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[...identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue" trace repeats for every remaining /proc/meminfo field...]
00:03:54.203 23:25:39
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var 
val 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 26769804 kB' 'MemUsed: 5864824 kB' 'SwapCached: 0 kB' 'Active: 2175684 kB' 'Inactive: 100316 kB' 'Active(anon): 1982688 kB' 'Inactive(anon): 0 kB' 'Active(file): 192996 kB' 'Inactive(file): 100316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2059948 kB' 'Mapped: 120512 kB' 'AnonPages: 219252 kB' 'Shmem: 1766636 kB' 'KernelStack: 11192 kB' 'PageTables: 5276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 96064 kB' 'Slab: 343580 kB' 'SReclaimable: 96064 kB' 'SUnreclaim: 247516 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.203 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:54.204 23:25:39 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:54.205 node0=1024 expecting 1024 00:03:54.205 23:25:39 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:54.205 00:03:54.205 real 0m6.145s 00:03:54.205 user 0m2.377s 00:03:54.205 sys 0m3.825s 00:03:54.205 23:25:39 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:54.205 23:25:39 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:54.205 ************************************ 00:03:54.205 END TEST no_shrink_alloc 00:03:54.205 ************************************ 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:54.205 23:25:39 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:54.205 00:03:54.205 real 0m22.849s 
00:03:54.205 user 0m8.582s 00:03:54.205 sys 0m13.483s 00:03:54.205 23:25:39 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:54.205 23:25:39 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:54.205 ************************************ 00:03:54.205 END TEST hugepages 00:03:54.205 ************************************ 00:03:54.205 23:25:39 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:54.205 23:25:39 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:54.205 23:25:39 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:54.205 23:25:39 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:54.205 ************************************ 00:03:54.205 START TEST driver 00:03:54.205 ************************************ 00:03:54.205 23:25:39 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:54.464 * Looking for test storage... 
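The hugepages trace above spends most of its xtrace walking a meminfo file field by field with `IFS=': ' read -r var val _` until the requested key (`HugePages_Total`, then per-node `HugePages_Surp`) matches, then echoes the value. A minimal standalone sketch of that lookup, with an illustrative function name rather than the exact SPDK helper (the traced script additionally strips the `Node N ` prefix from per-node meminfo lines before this loop):

```shell
#!/usr/bin/env bash
# Sketch of the meminfo lookup pattern in the trace above (illustrative
# name, not the exact SPDK helper). Reads a meminfo-style file line by
# line and prints the numeric value of the requested field.
# Per-NUMA-node values live in /sys/devices/system/node/nodeN/meminfo;
# the traced script strips their "Node N " prefix before matching.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. the hugepage count for HugePages_Total
            return 0
        fi
    done < "$mem_f"
    return 1              # field not present in this file
}

# Usage: total hugepages configured on this machine (Linux)
get_meminfo HugePages_Total || echo "HugePages_Total not reported"
```

The `IFS=': '` setting splits on both the colon and the spaces, so `MemTotal: 32634628 kB` yields `var=MemTotal`, `val=32634628`, with the `kB` unit discarded into `_`.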
00:03:54.464 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:54.464 23:25:39 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:03:54.464 23:25:39 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:54.464 23:25:39 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:58.659 23:25:43 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:58.659 23:25:43 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:58.659 23:25:43 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:58.659 23:25:43 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:03:58.659 ************************************ 00:03:58.659 START TEST guess_driver 00:03:58.659 ************************************ 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 220 > 0 )) 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:58.659 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:58.659 Looking for driver=vfio-pci 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:03:58.659 23:25:43 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:00.561 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ denied == \-\> ]] 00:04:00.561 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:00.561 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci 
== vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:45 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker 
setup_driver 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:01.128 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:02.064 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:02.064 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:02.064 23:25:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:02.064 23:25:47 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 
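The guess_driver trace above settles on vfio-pci by checking that the kernel exposes IOMMU groups under /sys/kernel/iommu_groups (220 on this node) and that `modprobe --show-depends vfio_pci` resolves the module and its dependency chain. A hedged sketch of that decision, with an illustrative function name; the uio_pci_generic fallback is an assumption here (a common no-IOMMU fallback), while the "No valid driver found" sentinel is the string the trace itself compares against:

```shell
#!/usr/bin/env bash
# Sketch of the driver-guess decision in the trace above (illustrative
# name, not the exact SPDK helper): prefer vfio-pci when the IOMMU is
# active, i.e. the kernel exposes at least one iommu group, and the
# vfio_pci module chain resolves via modprobe --show-depends.
pick_pci_driver() {
    local n
    # Count iommu group directories; an active IOMMU exposes one per group.
    n=$(find /sys/kernel/iommu_groups -mindepth 1 -maxdepth 1 2>/dev/null | wc -l)
    if (( n > 0 )) && modprobe --show-depends vfio_pci >/dev/null 2>&1; then
        echo vfio-pci
    elif modprobe --show-depends uio_pci_generic >/dev/null 2>&1; then
        echo uio_pci_generic   # assumed fallback when no IOMMU is available
    else
        echo "No valid driver found"
        return 1
    fi
}

pick_pci_driver || true
```

`modprobe --show-depends` succeeds without loading anything, printing the `insmod` lines for the module and its dependencies, which is exactly the output visible in the trace's `[[ insmod ... == *\.\k\o* ]]` test.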
00:04:02.064 23:25:47 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:02.064 23:25:47 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:02.064 23:25:47 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:06.257 00:04:06.257 real 0m7.885s 00:04:06.257 user 0m2.131s 00:04:06.257 sys 0m4.039s 00:04:06.257 23:25:51 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:06.257 23:25:51 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:06.257 ************************************ 00:04:06.257 END TEST guess_driver 00:04:06.257 ************************************ 00:04:06.257 00:04:06.257 real 0m11.967s 00:04:06.257 user 0m3.202s 00:04:06.257 sys 0m6.228s 00:04:06.257 23:25:51 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:06.257 23:25:51 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:06.257 ************************************ 00:04:06.257 END TEST driver 00:04:06.257 ************************************ 00:04:06.257 23:25:51 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:06.257 23:25:51 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:06.257 23:25:51 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:06.257 23:25:51 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:06.257 ************************************ 00:04:06.257 START TEST devices 00:04:06.257 ************************************ 00:04:06.257 23:25:51 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:06.516 * Looking for test storage... 
00:04:06.516 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:06.516 23:25:51 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:06.516 23:25:51 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:06.516 23:25:51 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:06.516 23:25:51 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n1 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme1n1 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in 
/sys/block/nvme* 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme1n2 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme1n2 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ host-managed != none ]] 00:04:09.809 23:25:54 setup.sh.devices -- common/autotest_common.sh@1674 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:09.809 23:25:54 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:09.809 No valid GPT data, bailing 00:04:09.809 23:25:54 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- scripts/common.sh@391 -- 
# pt= 00:04:09.809 23:25:54 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:09.809 23:25:54 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:09.809 23:25:54 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:09.810 23:25:54 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:09.810 23:25:54 setup.sh.devices -- setup/common.sh@80 -- # echo 1000204886016 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@204 -- # (( 1000204886016 >= min_disk_size )) 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n1 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n2 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@211 -- # declare -r 
test_disk=nvme0n1 00:04:09.810 23:25:54 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:09.810 23:25:54 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:09.810 23:25:54 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:09.810 23:25:54 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:09.810 ************************************ 00:04:09.810 START TEST nvme_mount 00:04:09.810 ************************************ 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@46 
-- # (( part <= part_no )) 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:09.810 23:25:54 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:10.790 Creating new GPT entries in memory. 00:04:10.790 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:10.790 other utilities. 00:04:10.790 23:25:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:10.790 23:25:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:10.790 23:25:55 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:10.790 23:25:55 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:10.790 23:25:55 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:11.729 Creating new GPT entries in memory. 00:04:11.729 The operation has completed successfully. 
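The `--new=1:2048:2099199` argument in the flocked sgdisk call above comes from the setup/common.sh arithmetic visible in the trace: the 1 GiB partition size is divided down to 512-byte sectors, the first partition starts at sector 2048, and each end is start + size - 1. A standalone bash sketch of that arithmetic (part_no=1 matches this nvme_mount run):

```shell
#!/usr/bin/env bash
# Reproduce the partition-boundary arithmetic from setup/common.sh.
size=1073741824          # 1 GiB per partition, in bytes
(( size /= 512 ))        # convert to 512-byte sectors: 2097152
part_no=1 part_start=0 part_end=0
args=()
for (( part = 1; part <= part_no; part++ )); do
    # First partition starts at sector 2048; later ones follow the previous end.
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    args+=("--new=$part:$part_start:$part_end")
done
echo "${args[@]}"
```

Running this prints `--new=1:2048:2099199`, matching the sgdisk invocation in the log.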
00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 181017 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:11.729 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:11.989 
23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.989 23:25:56 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.524 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:14.525 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:14.784 23:25:59 
setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:14.784 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:14.784 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:15.043 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:15.043 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:15.043 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:15.043 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ 
-e /dev/nvme0n1 ]] 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:15.043 23:25:59 setup.sh.devices.nvme_mount -- 
setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:17.579 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.579 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 
23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.839 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:17.840 
23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.840 23:26:02 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:21.130 /dev/nvme0n1: 2 bytes were erased at offset 
0x00000438 (ext4): 53 ef 00:04:21.130 00:04:21.130 real 0m11.340s 00:04:21.130 user 0m3.241s 00:04:21.130 sys 0m5.666s 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:21.130 23:26:05 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:21.130 ************************************ 00:04:21.130 END TEST nvme_mount 00:04:21.130 ************************************ 00:04:21.130 23:26:06 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:21.130 23:26:06 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:21.130 23:26:06 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:21.130 23:26:06 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:21.130 ************************************ 00:04:21.130 START TEST dm_mount 00:04:21.130 ************************************ 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:21.130 23:26:06 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:21.130 23:26:06 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:22.509 Creating new GPT entries in memory. 00:04:22.509 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:22.509 other utilities. 00:04:22.509 23:26:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:22.509 23:26:07 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:22.509 23:26:07 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:22.509 23:26:07 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:22.509 23:26:07 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:23.446 Creating new GPT entries in memory. 00:04:23.446 The operation has completed successfully. 
00:04:23.446 23:26:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:23.446 23:26:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:23.446 23:26:08 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:23.446 23:26:08 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:23.446 23:26:08 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:24.384 The operation has completed successfully. 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 185545 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- 
setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # 
local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.384 23:26:09 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:26.920 23:26:11 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:26.920 23:26:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:27.180 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:27.440 23:26:12 
setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:27.440 23:26:12 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:29.972 23:26:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:29.972 23:26:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.231 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:30.231 23:26:15 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:30.491 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:30.491 00:04:30.491 real 0m9.244s 00:04:30.491 user 0m2.230s 00:04:30.491 sys 0m3.886s 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:30.491 23:26:15 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:30.491 ************************************ 00:04:30.491 END TEST dm_mount 00:04:30.491 ************************************ 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:30.491 23:26:15 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:30.749 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:30.749 /dev/nvme0n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:30.749 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:30.749 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:30.749 23:26:15 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:30.749 23:26:15 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:30.749 23:26:15 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:30.749 23:26:15 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:30.750 23:26:15 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:30.750 23:26:15 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:30.750 23:26:15 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:30.750 00:04:30.750 real 0m24.421s 00:04:30.750 user 0m6.795s 00:04:30.750 sys 0m11.900s 00:04:30.750 23:26:15 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:30.750 23:26:15 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:30.750 ************************************ 00:04:30.750 END TEST devices 00:04:30.750 ************************************ 00:04:30.750 00:04:30.750 real 1m21.437s 00:04:30.750 user 0m26.260s 00:04:30.750 sys 0m44.766s 00:04:30.750 
23:26:15 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:30.750 23:26:15 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:30.750 ************************************ 00:04:30.750 END TEST setup.sh 00:04:30.750 ************************************ 00:04:30.750 23:26:15 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:33.281 Hugepages 00:04:33.281 node hugesize free / total 00:04:33.281 node0 1048576kB 0 / 0 00:04:33.281 node0 2048kB 1024 / 1024 00:04:33.281 node1 1048576kB 0 / 0 00:04:33.281 node1 2048kB 1024 / 1024 00:04:33.281 00:04:33.281 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:33.540 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:33.540 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:33.540 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:04:33.540 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme1 nvme1n1 nvme1n2 00:04:33.540 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:33.540 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:33.540 23:26:18 -- spdk/autotest.sh@130 -- # uname -s 00:04:33.540 23:26:18 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:33.540 23:26:18 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:33.540 23:26:18 -- common/autotest_common.sh@1531 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:36.075 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:36.334 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:36.334 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:36.334 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:36.334 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:36.334 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:36.334 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:36.593 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:37.530 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:37.530 23:26:22 -- common/autotest_common.sh@1532 -- # sleep 1 00:04:38.528 23:26:23 -- common/autotest_common.sh@1533 -- # bdfs=() 00:04:38.528 23:26:23 -- common/autotest_common.sh@1533 -- # local bdfs 00:04:38.528 23:26:23 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:04:38.528 23:26:23 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:04:38.528 23:26:23 -- common/autotest_common.sh@1513 -- # bdfs=() 00:04:38.528 23:26:23 -- common/autotest_common.sh@1513 -- # local bdfs 00:04:38.528 23:26:23 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:38.528 23:26:23 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:04:38.528 23:26:23 -- common/autotest_common.sh@1514 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:38.528 23:26:23 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:04:38.528 23:26:23 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:04:38.528 23:26:23 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:41.060 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:41.318 Waiting for block devices as requested 00:04:41.318 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:04:41.577 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:41.577 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:41.577 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:41.835 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:41.835 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:41.835 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:41.835 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:42.094 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:42.094 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:42.094 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:42.094 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:42.353 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:42.353 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:42.353 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:42.612 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:42.612 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:42.612 23:26:27 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:04:42.612 23:26:27 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:04:42.612 23:26:27 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 00:04:42.612 23:26:27 -- common/autotest_common.sh@1502 -- # 
bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:04:42.612 23:26:27 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:04:42.612 23:26:27 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:04:42.612 23:26:27 -- common/autotest_common.sh@1545 -- # grep oacs 00:04:42.613 23:26:27 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:04:42.613 23:26:27 -- common/autotest_common.sh@1545 -- # oacs=' 0xf' 00:04:42.613 23:26:27 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:04:42.613 23:26:27 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:04:42.613 23:26:27 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:04:42.613 23:26:27 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:04:42.613 23:26:27 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:04:42.613 23:26:27 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:04:42.613 23:26:27 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:04:42.613 23:26:27 -- common/autotest_common.sh@1557 -- # continue 00:04:42.613 23:26:27 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:04:42.613 23:26:27 -- common/autotest_common.sh@730 -- # xtrace_disable 00:04:42.613 23:26:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.871 23:26:27 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:04:42.871 23:26:27 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:42.871 23:26:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.871 23:26:27 -- spdk/autotest.sh@139 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:45.407 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:45.667 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:45.667 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:45.926 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:46.493 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:46.753 23:26:31 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:04:46.753 23:26:31 -- common/autotest_common.sh@730 -- # xtrace_disable 00:04:46.753 23:26:31 -- common/autotest_common.sh@10 -- # set +x 00:04:46.753 23:26:31 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:04:46.753 23:26:31 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:04:46.753 23:26:31 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:04:46.753 23:26:31 -- common/autotest_common.sh@1577 -- # bdfs=() 00:04:46.753 23:26:31 -- common/autotest_common.sh@1577 -- # local bdfs 00:04:46.753 23:26:31 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:04:46.753 23:26:31 -- common/autotest_common.sh@1513 -- # bdfs=() 00:04:46.753 23:26:31 -- common/autotest_common.sh@1513 -- # local bdfs 00:04:46.753 23:26:31 -- 
common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:46.753 23:26:31 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:46.753 23:26:31 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:04:46.753 23:26:31 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:04:46.753 23:26:31 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:04:46.753 23:26:31 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:04:46.753 23:26:31 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:04:46.753 23:26:31 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:04:46.753 23:26:31 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:46.753 23:26:31 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:04:46.753 23:26:31 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:5e:00.0 00:04:46.753 23:26:31 -- common/autotest_common.sh@1592 -- # [[ -z 0000:5e:00.0 ]] 00:04:46.753 23:26:31 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=195454 00:04:46.753 23:26:31 -- common/autotest_common.sh@1598 -- # waitforlisten 195454 00:04:46.753 23:26:31 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:46.753 23:26:31 -- common/autotest_common.sh@831 -- # '[' -z 195454 ']' 00:04:46.753 23:26:31 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:46.753 23:26:31 -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:46.753 23:26:31 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:46.753 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:04:46.753 23:26:31 -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:46.753 23:26:31 -- common/autotest_common.sh@10 -- # set +x 00:04:46.753 [2024-07-24 23:26:31.729790] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:04:46.753 [2024-07-24 23:26:31.729837] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid195454 ] 00:04:47.012 [2024-07-24 23:26:31.788317] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:47.012 [2024-07-24 23:26:31.866009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:04:47.580 23:26:32 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:47.580 23:26:32 -- common/autotest_common.sh@864 -- # return 0 00:04:47.580 23:26:32 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:04:47.580 23:26:32 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:04:47.580 23:26:32 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0 00:04:50.868 nvme0n1 00:04:50.868 23:26:35 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:50.868 [2024-07-24 23:26:35.645530] nvme_opal.c:2063:spdk_opal_cmd_revert_tper: *ERROR*: Error on starting admin SP session with error 18 00:04:50.868 [2024-07-24 23:26:35.645560] vbdev_opal_rpc.c: 134:rpc_bdev_nvme_opal_revert: *ERROR*: Revert TPer failure: 18 00:04:50.868 request: 00:04:50.868 { 00:04:50.868 "nvme_ctrlr_name": "nvme0", 00:04:50.868 "password": "test", 00:04:50.868 "method": "bdev_nvme_opal_revert", 00:04:50.868 "req_id": 1 00:04:50.868 } 00:04:50.868 Got JSON-RPC error response 00:04:50.868 response: 00:04:50.868 { 00:04:50.868 
"code": -32603, 00:04:50.868 "message": "Internal error" 00:04:50.868 } 00:04:50.868 23:26:35 -- common/autotest_common.sh@1604 -- # true 00:04:50.869 23:26:35 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:04:50.869 23:26:35 -- common/autotest_common.sh@1608 -- # killprocess 195454 00:04:50.869 23:26:35 -- common/autotest_common.sh@950 -- # '[' -z 195454 ']' 00:04:50.869 23:26:35 -- common/autotest_common.sh@954 -- # kill -0 195454 00:04:50.869 23:26:35 -- common/autotest_common.sh@955 -- # uname 00:04:50.869 23:26:35 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:50.869 23:26:35 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 195454 00:04:50.869 23:26:35 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:50.869 23:26:35 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:50.869 23:26:35 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 195454' 00:04:50.869 killing process with pid 195454 00:04:50.869 23:26:35 -- common/autotest_common.sh@969 -- # kill 195454 00:04:50.869 23:26:35 -- common/autotest_common.sh@974 -- # wait 195454 00:04:52.773 23:26:37 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:04:52.773 23:26:37 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:04:52.773 23:26:37 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:04:52.773 23:26:37 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:04:52.773 23:26:37 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:04:53.032 Restarting all devices. 
00:04:56.323 lstat() error: No such file or directory 00:04:56.323 QAT Error: No GENERAL section found 00:04:56.323 Failed to configure qat_dev0 00:04:56.323 lstat() error: No such file or directory 00:04:56.323 QAT Error: No GENERAL section found 00:04:56.323 Failed to configure qat_dev1 00:04:56.323 lstat() error: No such file or directory 00:04:56.323 QAT Error: No GENERAL section found 00:04:56.323 Failed to configure qat_dev2 00:04:56.323 enable sriov 00:04:56.323 Checking status of all devices. 00:04:56.323 There is 3 QAT acceleration device(s) in the system: 00:04:56.323 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down 00:04:56.323 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down 00:04:56.323 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down 00:04:57.260 0000:1a:00.0 set to 16 VFs 00:04:57.829 0000:1c:00.0 set to 16 VFs 00:04:58.766 0000:1e:00.0 set to 16 VFs 00:05:00.142 Properly configured the qat device with driver uio_pci_generic. 
00:05:00.142 23:26:44 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:00.142 23:26:44 -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:00.142 23:26:44 -- common/autotest_common.sh@10 -- # set +x 00:05:00.142 23:26:44 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:05:00.142 23:26:44 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:00.142 23:26:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:00.142 23:26:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:00.142 23:26:44 -- common/autotest_common.sh@10 -- # set +x 00:05:00.142 ************************************ 00:05:00.142 START TEST env 00:05:00.142 ************************************ 00:05:00.142 23:26:44 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:05:00.142 * Looking for test storage... 00:05:00.142 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:05:00.142 23:26:45 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:00.142 23:26:45 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:00.142 23:26:45 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:00.142 23:26:45 env -- common/autotest_common.sh@10 -- # set +x 00:05:00.142 ************************************ 00:05:00.142 START TEST env_memory 00:05:00.142 ************************************ 00:05:00.142 23:26:45 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:05:00.142 00:05:00.142 00:05:00.142 CUnit - A unit testing framework for C - Version 2.1-3 00:05:00.142 http://cunit.sourceforge.net/ 00:05:00.142 00:05:00.142 00:05:00.142 Suite: memory 00:05:00.142 Test: alloc and free memory map ...[2024-07-24 23:26:45.074349] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:00.142 passed 00:05:00.143 Test: mem map translation ...[2024-07-24 23:26:45.092493] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:00.143 [2024-07-24 23:26:45.092505] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:00.143 [2024-07-24 23:26:45.092540] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:00.143 [2024-07-24 23:26:45.092547] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:00.143 passed 00:05:00.143 Test: mem map registration ...[2024-07-24 23:26:45.129594] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:00.143 [2024-07-24 23:26:45.129609] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:00.143 passed 00:05:00.403 Test: mem map adjacent registrations ...passed 00:05:00.403 00:05:00.403 Run Summary: Type Total Ran Passed Failed Inactive 00:05:00.403 suites 1 1 n/a 0 0 00:05:00.403 tests 4 4 4 0 0 00:05:00.403 asserts 152 152 152 0 n/a 00:05:00.403 00:05:00.403 Elapsed time = 0.141 seconds 00:05:00.403 00:05:00.403 real 0m0.152s 00:05:00.403 user 0m0.143s 00:05:00.403 sys 0m0.008s 00:05:00.403 23:26:45 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:05:00.403 23:26:45 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:00.403 ************************************ 00:05:00.403 END TEST env_memory 00:05:00.403 ************************************ 00:05:00.403 23:26:45 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:00.403 23:26:45 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:00.403 23:26:45 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:00.403 23:26:45 env -- common/autotest_common.sh@10 -- # set +x 00:05:00.403 ************************************ 00:05:00.403 START TEST env_vtophys 00:05:00.403 ************************************ 00:05:00.403 23:26:45 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:00.403 EAL: lib.eal log level changed from notice to debug 00:05:00.403 EAL: Detected lcore 0 as core 0 on socket 0 00:05:00.403 EAL: Detected lcore 1 as core 1 on socket 0 00:05:00.403 EAL: Detected lcore 2 as core 2 on socket 0 00:05:00.403 EAL: Detected lcore 3 as core 3 on socket 0 00:05:00.403 EAL: Detected lcore 4 as core 4 on socket 0 00:05:00.403 EAL: Detected lcore 5 as core 5 on socket 0 00:05:00.403 EAL: Detected lcore 6 as core 6 on socket 0 00:05:00.403 EAL: Detected lcore 7 as core 8 on socket 0 00:05:00.403 EAL: Detected lcore 8 as core 9 on socket 0 00:05:00.403 EAL: Detected lcore 9 as core 10 on socket 0 00:05:00.403 EAL: Detected lcore 10 as core 11 on socket 0 00:05:00.404 EAL: Detected lcore 11 as core 12 on socket 0 00:05:00.404 EAL: Detected lcore 12 as core 13 on socket 0 00:05:00.404 EAL: Detected lcore 13 as core 16 on socket 0 00:05:00.404 EAL: Detected lcore 14 as core 17 on socket 0 00:05:00.404 EAL: Detected lcore 15 as core 18 on socket 0 00:05:00.404 EAL: Detected lcore 16 as core 19 on socket 0 00:05:00.404 EAL: Detected lcore 17 as core 20 on socket 0 00:05:00.404 EAL: 
Detected lcore 18 as core 21 on socket 0 00:05:00.404 EAL: Detected lcore 19 as core 25 on socket 0 00:05:00.404 EAL: Detected lcore 20 as core 26 on socket 0 00:05:00.404 EAL: Detected lcore 21 as core 27 on socket 0 00:05:00.404 EAL: Detected lcore 22 as core 28 on socket 0 00:05:00.404 EAL: Detected lcore 23 as core 29 on socket 0 00:05:00.404 EAL: Detected lcore 24 as core 0 on socket 1 00:05:00.404 EAL: Detected lcore 25 as core 1 on socket 1 00:05:00.404 EAL: Detected lcore 26 as core 2 on socket 1 00:05:00.404 EAL: Detected lcore 27 as core 3 on socket 1 00:05:00.404 EAL: Detected lcore 28 as core 4 on socket 1 00:05:00.404 EAL: Detected lcore 29 as core 5 on socket 1 00:05:00.404 EAL: Detected lcore 30 as core 6 on socket 1 00:05:00.404 EAL: Detected lcore 31 as core 8 on socket 1 00:05:00.404 EAL: Detected lcore 32 as core 9 on socket 1 00:05:00.404 EAL: Detected lcore 33 as core 10 on socket 1 00:05:00.404 EAL: Detected lcore 34 as core 11 on socket 1 00:05:00.404 EAL: Detected lcore 35 as core 12 on socket 1 00:05:00.404 EAL: Detected lcore 36 as core 13 on socket 1 00:05:00.404 EAL: Detected lcore 37 as core 16 on socket 1 00:05:00.404 EAL: Detected lcore 38 as core 17 on socket 1 00:05:00.404 EAL: Detected lcore 39 as core 18 on socket 1 00:05:00.404 EAL: Detected lcore 40 as core 19 on socket 1 00:05:00.404 EAL: Detected lcore 41 as core 20 on socket 1 00:05:00.404 EAL: Detected lcore 42 as core 21 on socket 1 00:05:00.404 EAL: Detected lcore 43 as core 25 on socket 1 00:05:00.404 EAL: Detected lcore 44 as core 26 on socket 1 00:05:00.404 EAL: Detected lcore 45 as core 27 on socket 1 00:05:00.404 EAL: Detected lcore 46 as core 28 on socket 1 00:05:00.404 EAL: Detected lcore 47 as core 29 on socket 1 00:05:00.404 EAL: Detected lcore 48 as core 0 on socket 0 00:05:00.404 EAL: Detected lcore 49 as core 1 on socket 0 00:05:00.404 EAL: Detected lcore 50 as core 2 on socket 0 00:05:00.404 EAL: Detected lcore 51 as core 3 on socket 0 00:05:00.404 EAL: 
Detected lcore 52 as core 4 on socket 0 00:05:00.404 EAL: Detected lcore 53 as core 5 on socket 0 00:05:00.404 EAL: Detected lcore 54 as core 6 on socket 0 00:05:00.404 EAL: Detected lcore 55 as core 8 on socket 0 00:05:00.404 EAL: Detected lcore 56 as core 9 on socket 0 00:05:00.404 EAL: Detected lcore 57 as core 10 on socket 0 00:05:00.404 EAL: Detected lcore 58 as core 11 on socket 0 00:05:00.404 EAL: Detected lcore 59 as core 12 on socket 0 00:05:00.404 EAL: Detected lcore 60 as core 13 on socket 0 00:05:00.404 EAL: Detected lcore 61 as core 16 on socket 0 00:05:00.404 EAL: Detected lcore 62 as core 17 on socket 0 00:05:00.404 EAL: Detected lcore 63 as core 18 on socket 0 00:05:00.404 EAL: Detected lcore 64 as core 19 on socket 0 00:05:00.404 EAL: Detected lcore 65 as core 20 on socket 0 00:05:00.404 EAL: Detected lcore 66 as core 21 on socket 0 00:05:00.404 EAL: Detected lcore 67 as core 25 on socket 0 00:05:00.404 EAL: Detected lcore 68 as core 26 on socket 0 00:05:00.404 EAL: Detected lcore 69 as core 27 on socket 0 00:05:00.404 EAL: Detected lcore 70 as core 28 on socket 0 00:05:00.404 EAL: Detected lcore 71 as core 29 on socket 0 00:05:00.404 EAL: Detected lcore 72 as core 0 on socket 1 00:05:00.404 EAL: Detected lcore 73 as core 1 on socket 1 00:05:00.404 EAL: Detected lcore 74 as core 2 on socket 1 00:05:00.404 EAL: Detected lcore 75 as core 3 on socket 1 00:05:00.404 EAL: Detected lcore 76 as core 4 on socket 1 00:05:00.404 EAL: Detected lcore 77 as core 5 on socket 1 00:05:00.404 EAL: Detected lcore 78 as core 6 on socket 1 00:05:00.404 EAL: Detected lcore 79 as core 8 on socket 1 00:05:00.404 EAL: Detected lcore 80 as core 9 on socket 1 00:05:00.404 EAL: Detected lcore 81 as core 10 on socket 1 00:05:00.404 EAL: Detected lcore 82 as core 11 on socket 1 00:05:00.404 EAL: Detected lcore 83 as core 12 on socket 1 00:05:00.404 EAL: Detected lcore 84 as core 13 on socket 1 00:05:00.404 EAL: Detected lcore 85 as core 16 on socket 1 00:05:00.404 EAL: 
Detected lcore 86 as core 17 on socket 1 00:05:00.404 EAL: Detected lcore 87 as core 18 on socket 1 00:05:00.404 EAL: Detected lcore 88 as core 19 on socket 1 00:05:00.404 EAL: Detected lcore 89 as core 20 on socket 1 00:05:00.404 EAL: Detected lcore 90 as core 21 on socket 1 00:05:00.404 EAL: Detected lcore 91 as core 25 on socket 1 00:05:00.404 EAL: Detected lcore 92 as core 26 on socket 1 00:05:00.404 EAL: Detected lcore 93 as core 27 on socket 1 00:05:00.404 EAL: Detected lcore 94 as core 28 on socket 1 00:05:00.404 EAL: Detected lcore 95 as core 29 on socket 1 00:05:00.404 EAL: Maximum logical cores by configuration: 128 00:05:00.404 EAL: Detected CPU lcores: 96 00:05:00.404 EAL: Detected NUMA nodes: 2 00:05:00.404 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:00.404 EAL: Detected shared linkage of DPDK 00:05:00.404 EAL: No shared files mode enabled, IPC will be disabled 00:05:00.404 EAL: No shared files mode enabled, IPC is disabled 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 
0000:1a:02.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA 
as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:05:00.404 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:05:00.404 EAL: Bus pci wants IOVA as 'PA' 00:05:00.404 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:00.404 EAL: Bus vdev wants IOVA as 'DC' 00:05:00.404 EAL: Selected IOVA mode 'PA' 00:05:00.404 EAL: Probing VFIO support... 00:05:00.404 EAL: IOMMU type 1 (Type 1) is supported 00:05:00.404 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:00.404 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:00.404 EAL: VFIO support initialized 00:05:00.404 EAL: Ask a virtual area of 0x2e000 bytes 00:05:00.404 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:00.404 EAL: Setting up physically contiguous memory... 
00:05:00.404 EAL: Setting maximum number of open files to 524288 00:05:00.404 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:00.404 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:00.404 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:00.404 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.404 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:00.404 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:00.404 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.404 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:00.404 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:00.404 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.404 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:00.404 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:00.404 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.404 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:00.404 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:00.404 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.404 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:00.404 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:00.405 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.405 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:00.405 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:00.405 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:05:00.405 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.405 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:00.405 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:00.405 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.405 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:00.405 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:00.405 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.405 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:00.405 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:00.405 EAL: Ask a virtual area of 0x61000 bytes 00:05:00.405 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:00.405 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:00.405 EAL: Ask a virtual area of 0x400000000 bytes 00:05:00.405 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:00.405 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:00.405 EAL: Hugepages will be freed exactly as allocated. 
00:05:00.405 EAL: No shared files mode enabled, IPC is disabled 00:05:00.405 EAL: No shared files mode enabled, IPC is disabled 00:05:00.405 EAL: TSC frequency is ~2100000 KHz 00:05:00.405 EAL: Main lcore 0 is ready (tid=7fb5fe6c0b00;cpuset=[0]) 00:05:00.405 EAL: Trying to obtain current memory policy. 00:05:00.405 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.405 EAL: Restoring previous memory policy: 0 00:05:00.405 EAL: request: mp_malloc_sync 00:05:00.405 EAL: No shared files mode enabled, IPC is disabled 00:05:00.405 EAL: Heap on socket 0 was expanded by 2MB 00:05:00.405 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001000000 00:05:00.405 EAL: PCI memory mapped at 0x202001001000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001002000 00:05:00.405 EAL: PCI memory mapped at 0x202001003000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001004000 00:05:00.405 EAL: PCI memory mapped at 0x202001005000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001006000 00:05:00.405 EAL: PCI memory mapped at 0x202001007000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001008000 00:05:00.405 EAL: PCI memory mapped at 0x202001009000 00:05:00.405 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200100a000 00:05:00.405 EAL: PCI memory mapped at 0x20200100b000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200100c000 00:05:00.405 EAL: PCI memory mapped at 0x20200100d000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200100e000 00:05:00.405 EAL: PCI memory mapped at 0x20200100f000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001010000 00:05:00.405 EAL: PCI memory mapped at 0x202001011000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001012000 00:05:00.405 EAL: PCI memory mapped at 0x202001013000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001014000 00:05:00.405 EAL: PCI memory mapped at 0x202001015000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 
0x202001016000 00:05:00.405 EAL: PCI memory mapped at 0x202001017000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001018000 00:05:00.405 EAL: PCI memory mapped at 0x202001019000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200101a000 00:05:00.405 EAL: PCI memory mapped at 0x20200101b000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200101c000 00:05:00.405 EAL: PCI memory mapped at 0x20200101d000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:00.405 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200101e000 00:05:00.405 EAL: PCI memory mapped at 0x20200101f000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001020000 00:05:00.405 EAL: PCI memory mapped at 0x202001021000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001022000 00:05:00.405 EAL: PCI memory mapped at 0x202001023000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 
00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001024000 00:05:00.405 EAL: PCI memory mapped at 0x202001025000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.3 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001026000 00:05:00.405 EAL: PCI memory mapped at 0x202001027000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.4 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001028000 00:05:00.405 EAL: PCI memory mapped at 0x202001029000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.5 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200102a000 00:05:00.405 EAL: PCI memory mapped at 0x20200102b000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.6 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200102c000 00:05:00.405 EAL: PCI memory mapped at 0x20200102d000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:01.7 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x20200102e000 00:05:00.405 EAL: PCI memory mapped at 0x20200102f000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:02.0 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.405 EAL: PCI memory mapped at 0x202001030000 00:05:00.405 EAL: PCI memory mapped at 0x202001031000 00:05:00.405 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:02.0 (socket 0) 00:05:00.405 EAL: PCI device 0000:1c:02.1 on NUMA socket 0 00:05:00.405 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001032000 00:05:00.406 EAL: PCI memory mapped at 0x202001033000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.2 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001034000 00:05:00.406 EAL: PCI memory mapped at 0x202001035000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.3 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001036000 00:05:00.406 EAL: PCI memory mapped at 0x202001037000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.4 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001038000 00:05:00.406 EAL: PCI memory mapped at 0x202001039000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.5 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200103a000 00:05:00.406 EAL: PCI memory mapped at 0x20200103b000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.6 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200103c000 00:05:00.406 EAL: PCI memory mapped at 0x20200103d000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:00.406 EAL: PCI device 0000:1c:02.7 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200103e000 00:05:00.406 EAL: PCI memory 
mapped at 0x20200103f000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.0 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001040000 00:05:00.406 EAL: PCI memory mapped at 0x202001041000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.1 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001042000 00:05:00.406 EAL: PCI memory mapped at 0x202001043000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.2 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001044000 00:05:00.406 EAL: PCI memory mapped at 0x202001045000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.3 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001046000 00:05:00.406 EAL: PCI memory mapped at 0x202001047000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.4 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001048000 00:05:00.406 EAL: PCI memory mapped at 0x202001049000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.5 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200104a000 00:05:00.406 EAL: PCI memory mapped at 0x20200104b000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.6 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 
00:05:00.406 EAL: PCI memory mapped at 0x20200104c000 00:05:00.406 EAL: PCI memory mapped at 0x20200104d000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:01.7 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200104e000 00:05:00.406 EAL: PCI memory mapped at 0x20200104f000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.0 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001050000 00:05:00.406 EAL: PCI memory mapped at 0x202001051000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.1 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001052000 00:05:00.406 EAL: PCI memory mapped at 0x202001053000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.2 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001054000 00:05:00.406 EAL: PCI memory mapped at 0x202001055000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.3 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001056000 00:05:00.406 EAL: PCI memory mapped at 0x202001057000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.4 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x202001058000 00:05:00.406 EAL: PCI memory mapped at 0x202001059000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:00.406 EAL: PCI 
device 0000:1e:02.5 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200105a000 00:05:00.406 EAL: PCI memory mapped at 0x20200105b000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.6 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200105c000 00:05:00.406 EAL: PCI memory mapped at 0x20200105d000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:00.406 EAL: PCI device 0000:1e:02.7 on NUMA socket 0 00:05:00.406 EAL: probe driver: 8086:37c9 qat 00:05:00.406 EAL: PCI memory mapped at 0x20200105e000 00:05:00.406 EAL: PCI memory mapped at 0x20200105f000 00:05:00.406 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: No PCI address specified using 'addr=' in: bus=pci 00:05:00.406 EAL: Mem event callback 'spdk:(nil)' registered 00:05:00.406 00:05:00.406 00:05:00.406 CUnit - A unit testing framework for C - Version 2.1-3 00:05:00.406 http://cunit.sourceforge.net/ 00:05:00.406 00:05:00.406 00:05:00.406 Suite: components_suite 00:05:00.406 Test: vtophys_malloc_test ...passed 00:05:00.406 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 
00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was expanded by 4MB 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was shrunk by 4MB 00:05:00.406 EAL: Trying to obtain current memory policy. 00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was expanded by 6MB 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was shrunk by 6MB 00:05:00.406 EAL: Trying to obtain current memory policy. 00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was expanded by 10MB 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was shrunk by 10MB 00:05:00.406 EAL: Trying to obtain current memory policy. 
00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was expanded by 18MB 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was shrunk by 18MB 00:05:00.406 EAL: Trying to obtain current memory policy. 00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was expanded by 34MB 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.406 EAL: Heap on socket 0 was shrunk by 34MB 00:05:00.406 EAL: Trying to obtain current memory policy. 00:05:00.406 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.406 EAL: Restoring previous memory policy: 4 00:05:00.406 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.406 EAL: request: mp_malloc_sync 00:05:00.406 EAL: No shared files mode enabled, IPC is disabled 00:05:00.407 EAL: Heap on socket 0 was expanded by 66MB 00:05:00.407 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.407 EAL: request: mp_malloc_sync 00:05:00.407 EAL: No shared files mode enabled, IPC is disabled 00:05:00.407 EAL: Heap on socket 0 was shrunk by 66MB 00:05:00.407 EAL: Trying to obtain current memory policy. 
00:05:00.407 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.666 EAL: Restoring previous memory policy: 4 00:05:00.666 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.666 EAL: request: mp_malloc_sync 00:05:00.666 EAL: No shared files mode enabled, IPC is disabled 00:05:00.666 EAL: Heap on socket 0 was expanded by 130MB 00:05:00.666 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.666 EAL: request: mp_malloc_sync 00:05:00.666 EAL: No shared files mode enabled, IPC is disabled 00:05:00.666 EAL: Heap on socket 0 was shrunk by 130MB 00:05:00.666 EAL: Trying to obtain current memory policy. 00:05:00.666 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.666 EAL: Restoring previous memory policy: 4 00:05:00.666 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.666 EAL: request: mp_malloc_sync 00:05:00.666 EAL: No shared files mode enabled, IPC is disabled 00:05:00.666 EAL: Heap on socket 0 was expanded by 258MB 00:05:00.666 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.666 EAL: request: mp_malloc_sync 00:05:00.666 EAL: No shared files mode enabled, IPC is disabled 00:05:00.666 EAL: Heap on socket 0 was shrunk by 258MB 00:05:00.666 EAL: Trying to obtain current memory policy. 00:05:00.666 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:00.925 EAL: Restoring previous memory policy: 4 00:05:00.925 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.925 EAL: request: mp_malloc_sync 00:05:00.925 EAL: No shared files mode enabled, IPC is disabled 00:05:00.925 EAL: Heap on socket 0 was expanded by 514MB 00:05:00.925 EAL: Calling mem event callback 'spdk:(nil)' 00:05:00.925 EAL: request: mp_malloc_sync 00:05:00.925 EAL: No shared files mode enabled, IPC is disabled 00:05:00.925 EAL: Heap on socket 0 was shrunk by 514MB 00:05:00.925 EAL: Trying to obtain current memory policy. 
00:05:00.925 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:01.182 EAL: Restoring previous memory policy: 4 00:05:01.182 EAL: Calling mem event callback 'spdk:(nil)' 00:05:01.182 EAL: request: mp_malloc_sync 00:05:01.182 EAL: No shared files mode enabled, IPC is disabled 00:05:01.182 EAL: Heap on socket 0 was expanded by 1026MB 00:05:01.182 EAL: Calling mem event callback 'spdk:(nil)' 00:05:01.441 EAL: request: mp_malloc_sync 00:05:01.441 EAL: No shared files mode enabled, IPC is disabled 00:05:01.441 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:01.441 passed 00:05:01.441 00:05:01.441 Run Summary: Type Total Ran Passed Failed Inactive 00:05:01.441 suites 1 1 n/a 0 0 00:05:01.441 tests 2 2 2 0 0 00:05:01.441 asserts 6618 6618 6618 0 n/a 00:05:01.441 00:05:01.441 Elapsed time = 0.959 seconds 00:05:01.441 EAL: No shared files mode enabled, IPC is disabled 00:05:01.441 EAL: No shared files mode enabled, IPC is disabled 00:05:01.441 EAL: No shared files mode enabled, IPC is disabled 00:05:01.441 00:05:01.441 real 0m1.092s 00:05:01.441 user 0m0.637s 00:05:01.441 sys 0m0.423s 00:05:01.441 23:26:46 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:01.441 23:26:46 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:05:01.441 ************************************ 00:05:01.441 END TEST env_vtophys 00:05:01.441 ************************************ 00:05:01.441 23:26:46 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:01.441 23:26:46 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:01.441 23:26:46 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:01.441 23:26:46 env -- common/autotest_common.sh@10 -- # set +x 00:05:01.441 ************************************ 00:05:01.441 START TEST env_pci 00:05:01.441 ************************************ 00:05:01.441 23:26:46 env.env_pci -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:05:01.441 00:05:01.441 00:05:01.441 CUnit - A unit testing framework for C - Version 2.1-3 00:05:01.441 http://cunit.sourceforge.net/ 00:05:01.441 00:05:01.441 00:05:01.441 Suite: pci 00:05:01.441 Test: pci_hook ...[2024-07-24 23:26:46.422894] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 198035 has claimed it 00:05:01.700 EAL: Cannot find device (10000:00:01.0) 00:05:01.700 EAL: Failed to attach device on primary process 00:05:01.700 passed 00:05:01.700 00:05:01.700 Run Summary: Type Total Ran Passed Failed Inactive 00:05:01.700 suites 1 1 n/a 0 0 00:05:01.700 tests 1 1 1 0 0 00:05:01.700 asserts 25 25 25 0 n/a 00:05:01.700 00:05:01.700 Elapsed time = 0.027 seconds 00:05:01.700 00:05:01.700 real 0m0.050s 00:05:01.700 user 0m0.016s 00:05:01.700 sys 0m0.033s 00:05:01.700 23:26:46 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:01.700 23:26:46 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:05:01.700 ************************************ 00:05:01.700 END TEST env_pci 00:05:01.700 ************************************ 00:05:01.700 23:26:46 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:01.700 23:26:46 env -- env/env.sh@15 -- # uname 00:05:01.700 23:26:46 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:01.700 23:26:46 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:01.700 23:26:46 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:01.700 23:26:46 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:05:01.700 23:26:46 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:01.701 23:26:46 env -- common/autotest_common.sh@10 -- # set +x 
00:05:01.701 ************************************ 00:05:01.701 START TEST env_dpdk_post_init 00:05:01.701 ************************************ 00:05:01.701 23:26:46 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:01.701 EAL: Detected CPU lcores: 96 00:05:01.701 EAL: Detected NUMA nodes: 2 00:05:01.701 EAL: Detected shared linkage of DPDK 00:05:01.701 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:01.701 EAL: Selected IOVA mode 'PA' 00:05:01.701 EAL: VFIO support initialized 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: 
Creating cryptodev 0000:1a:01.7_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 
00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 
00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 
0000:1c:01.5_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:01.701 CRYPTODEV: Initialisation 
parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:01.701 CRYPTODEV: 
Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 
0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.701 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:01.701 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:01.702 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0)
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0)
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0)
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:01.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0)
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:01.702 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym
00:05:01.702 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:01.702 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:01.702 EAL: Using IOMMU type 1 (Type 1)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0)
00:05:01.702 EAL: Ignore mapping IO port bar(1)
00:05:01.702 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0)
00:05:01.961 EAL: Ignore mapping IO port bar(1)
00:05:01.961 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0)
00:05:01.961 EAL: Ignore mapping IO port bar(1)
00:05:01.961 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0)
00:05:02.529 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:5e:00.0 (socket 0)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1)
00:05:02.529 EAL: Ignore mapping IO port bar(1)
00:05:02.529 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1)
00:05:02.813 EAL: Ignore mapping IO port bar(1)
00:05:02.813 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1)
00:05:02.813 EAL: Ignore mapping IO port bar(1)
00:05:02.813 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1)
00:05:06.166 EAL: Releasing PCI mapped resource for 0000:5e:00.0
00:05:06.166 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000
00:05:06.166 Starting DPDK initialization...
00:05:06.166 Starting SPDK post initialization...
00:05:06.166 SPDK NVMe probe
00:05:06.166 Attaching to 0000:5e:00.0
00:05:06.166 Attached to 0000:5e:00.0
00:05:06.166 Cleaning up...
00:05:06.166 
00:05:06.166 real 0m4.320s
00:05:06.166 user 0m3.265s
00:05:06.166 sys 0m0.129s
00:05:06.166 23:26:50 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:06.166 23:26:50 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x
00:05:06.166 ************************************
00:05:06.166 END TEST env_dpdk_post_init
00:05:06.166 ************************************
00:05:06.166 23:26:50 env -- env/env.sh@26 -- # uname
00:05:06.166 23:26:50 env -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:05:06.166 23:26:50 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:06.166 23:26:50 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:06.166 23:26:50 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:06.166 23:26:50 env -- common/autotest_common.sh@10 -- # set +x
00:05:06.166 ************************************
00:05:06.166 START TEST env_mem_callbacks
00:05:06.166 ************************************
00:05:06.166 23:26:50 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:06.166 EAL: Detected CPU lcores: 96
00:05:06.166 EAL: Detected NUMA nodes: 2
00:05:06.166 EAL: Detected shared linkage of DPDK
00:05:06.166 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:06.166 
EAL: Selected IOVA mode 'PA' 00:05:06.166 EAL: VFIO support initialized 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:06.166 CRYPTODEV: 
Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.166 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.166 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:05:06.166 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 
0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 
0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:02.3 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 
0000:1c:02.7_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.167 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:06.167 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:06.167 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:06.168 CRYPTODEV: 
Creating cryptodev 0000:1e:01.4_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:06.168 CRYPTODEV: 
Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0)
00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:05:06.168 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0)
00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:05:06.168 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym
00:05:06.168 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:05:06.168 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:06.168 
00:05:06.168 
00:05:06.168 CUnit - A unit testing framework for C - Version 2.1-3
00:05:06.168 http://cunit.sourceforge.net/
00:05:06.168 
00:05:06.168 
00:05:06.168 Suite: memory
00:05:06.168 Test: test ...
00:05:06.168 register 0x200000200000 2097152
00:05:06.168 malloc 3145728
00:05:06.168 register 0x200000400000 4194304
00:05:06.168 buf 0x200000500000 len 3145728 PASSED
00:05:06.168 malloc 64
00:05:06.168 buf 0x2000004fff40 len 64 PASSED
00:05:06.168 malloc 4194304
00:05:06.168 register 0x200000800000 6291456
00:05:06.168 buf 0x200000a00000 len 4194304 PASSED
00:05:06.168 free 0x200000500000 3145728
00:05:06.168 free 0x2000004fff40 64
00:05:06.168 unregister 0x200000400000 4194304 PASSED
00:05:06.168 free 0x200000a00000 4194304
00:05:06.168 unregister 0x200000800000 6291456 PASSED
00:05:06.168 malloc 8388608
00:05:06.168 register 0x200000400000 10485760
00:05:06.168 buf 0x200000600000 len 8388608 PASSED
00:05:06.168 free 0x200000600000 8388608
00:05:06.168 unregister 0x200000400000 10485760 PASSED
00:05:06.168 passed
00:05:06.168 
00:05:06.168 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:06.168               suites      1      1    n/a      0        0
00:05:06.168                tests      1      1      1      0        0
00:05:06.168              asserts     15     15     15      0      n/a
00:05:06.168 
00:05:06.168 Elapsed time = 0.005 seconds
00:05:06.168 
00:05:06.168 real 0m0.066s
00:05:06.168 user 0m0.025s
00:05:06.168 sys 0m0.041s
00:05:06.168 23:26:50 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:06.168 23:26:50 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:05:06.168 ************************************
00:05:06.168 END TEST env_mem_callbacks
00:05:06.168 ************************************
00:05:06.168 
00:05:06.168 real 0m6.047s
00:05:06.168 user 0m4.220s
00:05:06.168 sys 0m0.895s
00:05:06.168 23:26:50 env -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:06.168 23:26:50 env -- common/autotest_common.sh@10 -- # set +x
00:05:06.168 ************************************
00:05:06.168 END TEST env
00:05:06.168 ************************************
00:05:06.168 23:26:51 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh
00:05:06.168 23:26:51 -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:06.168 23:26:51 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:06.168 23:26:51 -- common/autotest_common.sh@10 -- # set +x 00:05:06.168 ************************************ 00:05:06.168 START TEST rpc 00:05:06.168 ************************************ 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:06.168 * Looking for test storage... 00:05:06.168 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:06.168 23:26:51 rpc -- rpc/rpc.sh@65 -- # spdk_pid=199068 00:05:06.168 23:26:51 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:06.168 23:26:51 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:06.168 23:26:51 rpc -- rpc/rpc.sh@67 -- # waitforlisten 199068 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@831 -- # '[' -z 199068 ']' 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:06.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:06.168 23:26:51 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:06.427 [2024-07-24 23:26:51.167231] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:06.427 [2024-07-24 23:26:51.167275] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199068 ] 00:05:06.427 [2024-07-24 23:26:51.232401] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:06.427 [2024-07-24 23:26:51.304020] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:06.427 [2024-07-24 23:26:51.304061] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 199068' to capture a snapshot of events at runtime. 00:05:06.427 [2024-07-24 23:26:51.304067] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:06.427 [2024-07-24 23:26:51.304075] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:06.427 [2024-07-24 23:26:51.304079] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid199068 for offline analysis/debug. 
00:05:06.427 [2024-07-24 23:26:51.304097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:06.994 23:26:51 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:06.994 23:26:51 rpc -- common/autotest_common.sh@864 -- # return 0 00:05:06.994 23:26:51 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:06.994 23:26:51 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:06.994 23:26:51 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:06.994 23:26:51 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:06.994 23:26:51 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:06.994 23:26:51 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:06.994 23:26:51 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:06.994 ************************************ 00:05:06.994 START TEST rpc_integrity 00:05:06.994 ************************************ 00:05:06.994 23:26:51 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:05:06.994 23:26:51 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:06.994 23:26:51 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:06.994 23:26:51 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.253 23:26:51 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.253 23:26:51 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:07.253 23:26:51 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:07.253 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:07.253 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.253 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:07.253 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.253 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.253 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:07.253 { 00:05:07.253 "name": "Malloc0", 00:05:07.253 "aliases": [ 00:05:07.253 "f04c8619-e503-4cf0-8a04-b29bbdedec91" 00:05:07.253 ], 00:05:07.253 "product_name": "Malloc disk", 00:05:07.253 "block_size": 512, 00:05:07.253 "num_blocks": 16384, 00:05:07.253 "uuid": "f04c8619-e503-4cf0-8a04-b29bbdedec91", 00:05:07.253 "assigned_rate_limits": { 00:05:07.253 "rw_ios_per_sec": 0, 00:05:07.253 "rw_mbytes_per_sec": 0, 00:05:07.253 "r_mbytes_per_sec": 0, 00:05:07.253 "w_mbytes_per_sec": 0 00:05:07.253 }, 00:05:07.253 "claimed": false, 00:05:07.253 "zoned": false, 00:05:07.253 "supported_io_types": { 00:05:07.253 "read": true, 00:05:07.253 "write": true, 00:05:07.253 "unmap": true, 00:05:07.253 "flush": true, 00:05:07.253 "reset": true, 00:05:07.253 "nvme_admin": false, 00:05:07.253 "nvme_io": false, 00:05:07.253 "nvme_io_md": false, 00:05:07.253 "write_zeroes": true, 00:05:07.253 "zcopy": true, 00:05:07.253 "get_zone_info": false, 00:05:07.253 "zone_management": 
false, 00:05:07.253 "zone_append": false, 00:05:07.253 "compare": false, 00:05:07.253 "compare_and_write": false, 00:05:07.253 "abort": true, 00:05:07.253 "seek_hole": false, 00:05:07.253 "seek_data": false, 00:05:07.253 "copy": true, 00:05:07.254 "nvme_iov_md": false 00:05:07.254 }, 00:05:07.254 "memory_domains": [ 00:05:07.254 { 00:05:07.254 "dma_device_id": "system", 00:05:07.254 "dma_device_type": 1 00:05:07.254 }, 00:05:07.254 { 00:05:07.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:07.254 "dma_device_type": 2 00:05:07.254 } 00:05:07.254 ], 00:05:07.254 "driver_specific": {} 00:05:07.254 } 00:05:07.254 ]' 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 [2024-07-24 23:26:52.111029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:07.254 [2024-07-24 23:26:52.111060] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:07.254 [2024-07-24 23:26:52.111072] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce6150 00:05:07.254 [2024-07-24 23:26:52.111078] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:07.254 [2024-07-24 23:26:52.112145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:07.254 [2024-07-24 23:26:52.112166] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:07.254 Passthru0 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:07.254 { 00:05:07.254 "name": "Malloc0", 00:05:07.254 "aliases": [ 00:05:07.254 "f04c8619-e503-4cf0-8a04-b29bbdedec91" 00:05:07.254 ], 00:05:07.254 "product_name": "Malloc disk", 00:05:07.254 "block_size": 512, 00:05:07.254 "num_blocks": 16384, 00:05:07.254 "uuid": "f04c8619-e503-4cf0-8a04-b29bbdedec91", 00:05:07.254 "assigned_rate_limits": { 00:05:07.254 "rw_ios_per_sec": 0, 00:05:07.254 "rw_mbytes_per_sec": 0, 00:05:07.254 "r_mbytes_per_sec": 0, 00:05:07.254 "w_mbytes_per_sec": 0 00:05:07.254 }, 00:05:07.254 "claimed": true, 00:05:07.254 "claim_type": "exclusive_write", 00:05:07.254 "zoned": false, 00:05:07.254 "supported_io_types": { 00:05:07.254 "read": true, 00:05:07.254 "write": true, 00:05:07.254 "unmap": true, 00:05:07.254 "flush": true, 00:05:07.254 "reset": true, 00:05:07.254 "nvme_admin": false, 00:05:07.254 "nvme_io": false, 00:05:07.254 "nvme_io_md": false, 00:05:07.254 "write_zeroes": true, 00:05:07.254 "zcopy": true, 00:05:07.254 "get_zone_info": false, 00:05:07.254 "zone_management": false, 00:05:07.254 "zone_append": false, 00:05:07.254 "compare": false, 00:05:07.254 "compare_and_write": false, 00:05:07.254 "abort": true, 00:05:07.254 "seek_hole": false, 00:05:07.254 "seek_data": false, 00:05:07.254 "copy": true, 00:05:07.254 "nvme_iov_md": false 00:05:07.254 }, 00:05:07.254 "memory_domains": [ 00:05:07.254 { 00:05:07.254 "dma_device_id": "system", 00:05:07.254 "dma_device_type": 1 00:05:07.254 }, 00:05:07.254 { 00:05:07.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:07.254 "dma_device_type": 2 00:05:07.254 } 00:05:07.254 ], 00:05:07.254 "driver_specific": {} 00:05:07.254 }, 00:05:07.254 { 00:05:07.254 
"name": "Passthru0", 00:05:07.254 "aliases": [ 00:05:07.254 "f67a9a42-9390-5b17-aec0-bde2a26094cd" 00:05:07.254 ], 00:05:07.254 "product_name": "passthru", 00:05:07.254 "block_size": 512, 00:05:07.254 "num_blocks": 16384, 00:05:07.254 "uuid": "f67a9a42-9390-5b17-aec0-bde2a26094cd", 00:05:07.254 "assigned_rate_limits": { 00:05:07.254 "rw_ios_per_sec": 0, 00:05:07.254 "rw_mbytes_per_sec": 0, 00:05:07.254 "r_mbytes_per_sec": 0, 00:05:07.254 "w_mbytes_per_sec": 0 00:05:07.254 }, 00:05:07.254 "claimed": false, 00:05:07.254 "zoned": false, 00:05:07.254 "supported_io_types": { 00:05:07.254 "read": true, 00:05:07.254 "write": true, 00:05:07.254 "unmap": true, 00:05:07.254 "flush": true, 00:05:07.254 "reset": true, 00:05:07.254 "nvme_admin": false, 00:05:07.254 "nvme_io": false, 00:05:07.254 "nvme_io_md": false, 00:05:07.254 "write_zeroes": true, 00:05:07.254 "zcopy": true, 00:05:07.254 "get_zone_info": false, 00:05:07.254 "zone_management": false, 00:05:07.254 "zone_append": false, 00:05:07.254 "compare": false, 00:05:07.254 "compare_and_write": false, 00:05:07.254 "abort": true, 00:05:07.254 "seek_hole": false, 00:05:07.254 "seek_data": false, 00:05:07.254 "copy": true, 00:05:07.254 "nvme_iov_md": false 00:05:07.254 }, 00:05:07.254 "memory_domains": [ 00:05:07.254 { 00:05:07.254 "dma_device_id": "system", 00:05:07.254 "dma_device_type": 1 00:05:07.254 }, 00:05:07.254 { 00:05:07.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:07.254 "dma_device_type": 2 00:05:07.254 } 00:05:07.254 ], 00:05:07.254 "driver_specific": { 00:05:07.254 "passthru": { 00:05:07.254 "name": "Passthru0", 00:05:07.254 "base_bdev_name": "Malloc0" 00:05:07.254 } 00:05:07.254 } 00:05:07.254 } 00:05:07.254 ]' 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:07.254 23:26:52 rpc.rpc_integrity -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:07.254 23:26:52 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:07.254 00:05:07.254 real 0m0.260s 00:05:07.254 user 0m0.179s 00:05:07.254 sys 0m0.019s 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:07.254 23:26:52 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:07.254 ************************************ 00:05:07.254 END TEST rpc_integrity 00:05:07.254 ************************************ 00:05:07.513 23:26:52 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:07.513 23:26:52 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:07.513 23:26:52 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:07.513 23:26:52 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:07.513 ************************************ 00:05:07.513 START TEST rpc_plugins 00:05:07.513 
************************************ 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:05:07.513 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.513 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:07.513 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:07.513 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.513 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:07.513 { 00:05:07.513 "name": "Malloc1", 00:05:07.513 "aliases": [ 00:05:07.513 "ede34555-0da3-43cc-b696-cb87c12377d3" 00:05:07.513 ], 00:05:07.513 "product_name": "Malloc disk", 00:05:07.513 "block_size": 4096, 00:05:07.513 "num_blocks": 256, 00:05:07.513 "uuid": "ede34555-0da3-43cc-b696-cb87c12377d3", 00:05:07.513 "assigned_rate_limits": { 00:05:07.513 "rw_ios_per_sec": 0, 00:05:07.513 "rw_mbytes_per_sec": 0, 00:05:07.513 "r_mbytes_per_sec": 0, 00:05:07.513 "w_mbytes_per_sec": 0 00:05:07.513 }, 00:05:07.513 "claimed": false, 00:05:07.513 "zoned": false, 00:05:07.513 "supported_io_types": { 00:05:07.513 "read": true, 00:05:07.514 "write": true, 00:05:07.514 "unmap": true, 00:05:07.514 "flush": true, 00:05:07.514 "reset": true, 00:05:07.514 "nvme_admin": false, 00:05:07.514 "nvme_io": false, 00:05:07.514 "nvme_io_md": false, 00:05:07.514 "write_zeroes": true, 00:05:07.514 "zcopy": true, 00:05:07.514 "get_zone_info": false, 00:05:07.514 "zone_management": false, 00:05:07.514 "zone_append": false, 
00:05:07.514 "compare": false, 00:05:07.514 "compare_and_write": false, 00:05:07.514 "abort": true, 00:05:07.514 "seek_hole": false, 00:05:07.514 "seek_data": false, 00:05:07.514 "copy": true, 00:05:07.514 "nvme_iov_md": false 00:05:07.514 }, 00:05:07.514 "memory_domains": [ 00:05:07.514 { 00:05:07.514 "dma_device_id": "system", 00:05:07.514 "dma_device_type": 1 00:05:07.514 }, 00:05:07.514 { 00:05:07.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:07.514 "dma_device_type": 2 00:05:07.514 } 00:05:07.514 ], 00:05:07.514 "driver_specific": {} 00:05:07.514 } 00:05:07.514 ]' 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:07.514 23:26:52 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:07.514 00:05:07.514 real 0m0.133s 00:05:07.514 user 0m0.084s 00:05:07.514 sys 0m0.013s 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:07.514 23:26:52 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:07.514 ************************************ 00:05:07.514 END TEST 
rpc_plugins 00:05:07.514 ************************************ 00:05:07.514 23:26:52 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:07.514 23:26:52 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:07.514 23:26:52 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:07.514 23:26:52 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:07.514 ************************************ 00:05:07.514 START TEST rpc_trace_cmd_test 00:05:07.514 ************************************ 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:07.514 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid199068", 00:05:07.514 "tpoint_group_mask": "0x8", 00:05:07.514 "iscsi_conn": { 00:05:07.514 "mask": "0x2", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "scsi": { 00:05:07.514 "mask": "0x4", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "bdev": { 00:05:07.514 "mask": "0x8", 00:05:07.514 "tpoint_mask": "0xffffffffffffffff" 00:05:07.514 }, 00:05:07.514 "nvmf_rdma": { 00:05:07.514 "mask": "0x10", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "nvmf_tcp": { 00:05:07.514 "mask": "0x20", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "ftl": { 00:05:07.514 "mask": "0x40", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "blobfs": { 00:05:07.514 "mask": "0x80", 00:05:07.514 "tpoint_mask": "0x0" 
00:05:07.514 }, 00:05:07.514 "dsa": { 00:05:07.514 "mask": "0x200", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "thread": { 00:05:07.514 "mask": "0x400", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "nvme_pcie": { 00:05:07.514 "mask": "0x800", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "iaa": { 00:05:07.514 "mask": "0x1000", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "nvme_tcp": { 00:05:07.514 "mask": "0x2000", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "bdev_nvme": { 00:05:07.514 "mask": "0x4000", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 }, 00:05:07.514 "sock": { 00:05:07.514 "mask": "0x8000", 00:05:07.514 "tpoint_mask": "0x0" 00:05:07.514 } 00:05:07.514 }' 00:05:07.514 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:07.773 00:05:07.773 real 0m0.208s 00:05:07.773 user 0m0.176s 00:05:07.773 sys 0m0.022s 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:07.773 23:26:52 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:07.773 ************************************ 
00:05:07.773 END TEST rpc_trace_cmd_test 00:05:07.773 ************************************ 00:05:07.773 23:26:52 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:07.773 23:26:52 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:07.773 23:26:52 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:07.773 23:26:52 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:07.773 23:26:52 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:07.773 23:26:52 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:07.773 ************************************ 00:05:07.773 START TEST rpc_daemon_integrity 00:05:07.773 ************************************ 00:05:07.773 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:05:07.773 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:07.773 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:07.773 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:08.032 23:26:52 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:08.032 { 00:05:08.032 "name": "Malloc2", 00:05:08.032 "aliases": [ 00:05:08.032 "6750feba-7c98-4530-921a-db9bc1c8d6ca" 00:05:08.032 ], 00:05:08.032 "product_name": "Malloc disk", 00:05:08.032 "block_size": 512, 00:05:08.032 "num_blocks": 16384, 00:05:08.032 "uuid": "6750feba-7c98-4530-921a-db9bc1c8d6ca", 00:05:08.032 "assigned_rate_limits": { 00:05:08.032 "rw_ios_per_sec": 0, 00:05:08.032 "rw_mbytes_per_sec": 0, 00:05:08.032 "r_mbytes_per_sec": 0, 00:05:08.032 "w_mbytes_per_sec": 0 00:05:08.032 }, 00:05:08.032 "claimed": false, 00:05:08.032 "zoned": false, 00:05:08.032 "supported_io_types": { 00:05:08.032 "read": true, 00:05:08.032 "write": true, 00:05:08.032 "unmap": true, 00:05:08.032 "flush": true, 00:05:08.032 "reset": true, 00:05:08.032 "nvme_admin": false, 00:05:08.032 "nvme_io": false, 00:05:08.032 "nvme_io_md": false, 00:05:08.032 "write_zeroes": true, 00:05:08.032 "zcopy": true, 00:05:08.032 "get_zone_info": false, 00:05:08.032 "zone_management": false, 00:05:08.032 "zone_append": false, 00:05:08.032 "compare": false, 00:05:08.032 "compare_and_write": false, 00:05:08.032 "abort": true, 00:05:08.032 "seek_hole": false, 00:05:08.032 "seek_data": false, 00:05:08.032 "copy": true, 00:05:08.032 "nvme_iov_md": false 00:05:08.032 }, 00:05:08.032 "memory_domains": [ 00:05:08.032 { 00:05:08.032 "dma_device_id": "system", 00:05:08.032 "dma_device_type": 1 00:05:08.032 }, 00:05:08.032 { 00:05:08.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:08.032 "dma_device_type": 2 00:05:08.032 } 00:05:08.032 ], 00:05:08.032 "driver_specific": {} 00:05:08.032 } 00:05:08.032 ]' 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@17 -- # jq length 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.032 [2024-07-24 23:26:52.889145] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:08.032 [2024-07-24 23:26:52.889172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:08.032 [2024-07-24 23:26:52.889186] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e7d7e0 00:05:08.032 [2024-07-24 23:26:52.889192] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:08.032 [2024-07-24 23:26:52.890132] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:08.032 [2024-07-24 23:26:52.890152] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:08.032 Passthru0 00:05:08.032 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:08.033 { 00:05:08.033 "name": "Malloc2", 00:05:08.033 "aliases": [ 00:05:08.033 "6750feba-7c98-4530-921a-db9bc1c8d6ca" 00:05:08.033 ], 00:05:08.033 "product_name": "Malloc disk", 00:05:08.033 "block_size": 512, 00:05:08.033 "num_blocks": 16384, 00:05:08.033 
"uuid": "6750feba-7c98-4530-921a-db9bc1c8d6ca", 00:05:08.033 "assigned_rate_limits": { 00:05:08.033 "rw_ios_per_sec": 0, 00:05:08.033 "rw_mbytes_per_sec": 0, 00:05:08.033 "r_mbytes_per_sec": 0, 00:05:08.033 "w_mbytes_per_sec": 0 00:05:08.033 }, 00:05:08.033 "claimed": true, 00:05:08.033 "claim_type": "exclusive_write", 00:05:08.033 "zoned": false, 00:05:08.033 "supported_io_types": { 00:05:08.033 "read": true, 00:05:08.033 "write": true, 00:05:08.033 "unmap": true, 00:05:08.033 "flush": true, 00:05:08.033 "reset": true, 00:05:08.033 "nvme_admin": false, 00:05:08.033 "nvme_io": false, 00:05:08.033 "nvme_io_md": false, 00:05:08.033 "write_zeroes": true, 00:05:08.033 "zcopy": true, 00:05:08.033 "get_zone_info": false, 00:05:08.033 "zone_management": false, 00:05:08.033 "zone_append": false, 00:05:08.033 "compare": false, 00:05:08.033 "compare_and_write": false, 00:05:08.033 "abort": true, 00:05:08.033 "seek_hole": false, 00:05:08.033 "seek_data": false, 00:05:08.033 "copy": true, 00:05:08.033 "nvme_iov_md": false 00:05:08.033 }, 00:05:08.033 "memory_domains": [ 00:05:08.033 { 00:05:08.033 "dma_device_id": "system", 00:05:08.033 "dma_device_type": 1 00:05:08.033 }, 00:05:08.033 { 00:05:08.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:08.033 "dma_device_type": 2 00:05:08.033 } 00:05:08.033 ], 00:05:08.033 "driver_specific": {} 00:05:08.033 }, 00:05:08.033 { 00:05:08.033 "name": "Passthru0", 00:05:08.033 "aliases": [ 00:05:08.033 "0fc12cf1-4c31-5a98-939b-3dc77729d3f3" 00:05:08.033 ], 00:05:08.033 "product_name": "passthru", 00:05:08.033 "block_size": 512, 00:05:08.033 "num_blocks": 16384, 00:05:08.033 "uuid": "0fc12cf1-4c31-5a98-939b-3dc77729d3f3", 00:05:08.033 "assigned_rate_limits": { 00:05:08.033 "rw_ios_per_sec": 0, 00:05:08.033 "rw_mbytes_per_sec": 0, 00:05:08.033 "r_mbytes_per_sec": 0, 00:05:08.033 "w_mbytes_per_sec": 0 00:05:08.033 }, 00:05:08.033 "claimed": false, 00:05:08.033 "zoned": false, 00:05:08.033 "supported_io_types": { 00:05:08.033 "read": true, 
00:05:08.033 "write": true, 00:05:08.033 "unmap": true, 00:05:08.033 "flush": true, 00:05:08.033 "reset": true, 00:05:08.033 "nvme_admin": false, 00:05:08.033 "nvme_io": false, 00:05:08.033 "nvme_io_md": false, 00:05:08.033 "write_zeroes": true, 00:05:08.033 "zcopy": true, 00:05:08.033 "get_zone_info": false, 00:05:08.033 "zone_management": false, 00:05:08.033 "zone_append": false, 00:05:08.033 "compare": false, 00:05:08.033 "compare_and_write": false, 00:05:08.033 "abort": true, 00:05:08.033 "seek_hole": false, 00:05:08.033 "seek_data": false, 00:05:08.033 "copy": true, 00:05:08.033 "nvme_iov_md": false 00:05:08.033 }, 00:05:08.033 "memory_domains": [ 00:05:08.033 { 00:05:08.033 "dma_device_id": "system", 00:05:08.033 "dma_device_type": 1 00:05:08.033 }, 00:05:08.033 { 00:05:08.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:08.033 "dma_device_type": 2 00:05:08.033 } 00:05:08.033 ], 00:05:08.033 "driver_specific": { 00:05:08.033 "passthru": { 00:05:08.033 "name": "Passthru0", 00:05:08.033 "base_bdev_name": "Malloc2" 00:05:08.033 } 00:05:08.033 } 00:05:08.033 } 00:05:08.033 ]' 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity 
-- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:08.033 23:26:52 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:08.033 23:26:53 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:08.033 00:05:08.033 real 0m0.258s 00:05:08.033 user 0m0.166s 00:05:08.033 sys 0m0.030s 00:05:08.033 23:26:53 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:08.033 23:26:53 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:08.033 ************************************ 00:05:08.033 END TEST rpc_daemon_integrity 00:05:08.033 ************************************ 00:05:08.292 23:26:53 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:08.292 23:26:53 rpc -- rpc/rpc.sh@84 -- # killprocess 199068 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@950 -- # '[' -z 199068 ']' 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@954 -- # kill -0 199068 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@955 -- # uname 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 199068 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 199068' 00:05:08.292 killing process with pid 199068 00:05:08.292 
23:26:53 rpc -- common/autotest_common.sh@969 -- # kill 199068 00:05:08.292 23:26:53 rpc -- common/autotest_common.sh@974 -- # wait 199068 00:05:08.549 00:05:08.549 real 0m2.356s 00:05:08.549 user 0m3.040s 00:05:08.549 sys 0m0.612s 00:05:08.549 23:26:53 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:08.549 23:26:53 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:08.549 ************************************ 00:05:08.549 END TEST rpc 00:05:08.549 ************************************ 00:05:08.549 23:26:53 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:08.549 23:26:53 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:08.549 23:26:53 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:08.549 23:26:53 -- common/autotest_common.sh@10 -- # set +x 00:05:08.549 ************************************ 00:05:08.549 START TEST skip_rpc 00:05:08.549 ************************************ 00:05:08.549 23:26:53 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:08.549 * Looking for test storage... 
00:05:08.549 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:08.549 23:26:53 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:08.549 23:26:53 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:08.549 23:26:53 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:08.549 23:26:53 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:08.549 23:26:53 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:08.549 23:26:53 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:08.807 ************************************ 00:05:08.807 START TEST skip_rpc 00:05:08.807 ************************************ 00:05:08.807 23:26:53 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:05:08.807 23:26:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=199633 00:05:08.807 23:26:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:08.807 23:26:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:08.807 23:26:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:08.807 [2024-07-24 23:26:53.634334] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:08.807 [2024-07-24 23:26:53.634377] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid199633 ] 00:05:08.807 [2024-07-24 23:26:53.697482] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:08.807 [2024-07-24 23:26:53.768633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 199633 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 199633 ']' 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 199633 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 199633 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 199633' 00:05:14.079 killing process with pid 199633 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 199633 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 199633 00:05:14.079 00:05:14.079 real 0m5.362s 00:05:14.079 user 0m5.120s 00:05:14.079 sys 0m0.259s 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:14.079 23:26:58 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:14.079 ************************************ 00:05:14.079 END TEST skip_rpc 00:05:14.079 ************************************ 00:05:14.079 23:26:58 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:14.079 23:26:58 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:14.079 23:26:58 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:14.079 23:26:58 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:14.079 ************************************ 00:05:14.079 START TEST 
skip_rpc_with_json 00:05:14.079 ************************************ 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=200566 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 200566 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 200566 ']' 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:14.079 23:26:58 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:14.079 [2024-07-24 23:26:59.031860] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:14.079 [2024-07-24 23:26:59.031896] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid200566 ] 00:05:14.338 [2024-07-24 23:26:59.096293] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.338 [2024-07-24 23:26:59.174903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:14.906 [2024-07-24 23:26:59.831067] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:14.906 request: 00:05:14.906 { 00:05:14.906 "trtype": "tcp", 00:05:14.906 "method": "nvmf_get_transports", 00:05:14.906 "req_id": 1 00:05:14.906 } 00:05:14.906 Got JSON-RPC error response 00:05:14.906 response: 00:05:14.906 { 00:05:14.906 "code": -19, 00:05:14.906 "message": "No such device" 00:05:14.906 } 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:14.906 [2024-07-24 23:26:59.843173] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:14.906 23:26:59 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.906 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:15.165 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:15.165 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:15.165 { 00:05:15.165 "subsystems": [ 00:05:15.165 { 00:05:15.165 "subsystem": "keyring", 00:05:15.165 "config": [] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "iobuf", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "iobuf_set_options", 00:05:15.165 "params": { 00:05:15.165 "small_pool_count": 8192, 00:05:15.165 "large_pool_count": 1024, 00:05:15.165 "small_bufsize": 8192, 00:05:15.165 "large_bufsize": 135168 00:05:15.165 } 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "sock", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "sock_set_default_impl", 00:05:15.165 "params": { 00:05:15.165 "impl_name": "posix" 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "sock_impl_set_options", 00:05:15.165 "params": { 00:05:15.165 "impl_name": "ssl", 00:05:15.165 "recv_buf_size": 4096, 00:05:15.165 "send_buf_size": 4096, 00:05:15.165 "enable_recv_pipe": true, 00:05:15.165 "enable_quickack": false, 00:05:15.165 "enable_placement_id": 0, 00:05:15.165 "enable_zerocopy_send_server": true, 00:05:15.165 "enable_zerocopy_send_client": false, 00:05:15.165 "zerocopy_threshold": 0, 00:05:15.165 "tls_version": 0, 00:05:15.165 "enable_ktls": false 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "sock_impl_set_options", 00:05:15.165 "params": { 
00:05:15.165 "impl_name": "posix", 00:05:15.165 "recv_buf_size": 2097152, 00:05:15.165 "send_buf_size": 2097152, 00:05:15.165 "enable_recv_pipe": true, 00:05:15.165 "enable_quickack": false, 00:05:15.165 "enable_placement_id": 0, 00:05:15.165 "enable_zerocopy_send_server": true, 00:05:15.165 "enable_zerocopy_send_client": false, 00:05:15.165 "zerocopy_threshold": 0, 00:05:15.165 "tls_version": 0, 00:05:15.165 "enable_ktls": false 00:05:15.165 } 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "vmd", 00:05:15.165 "config": [] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "accel", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "accel_set_options", 00:05:15.165 "params": { 00:05:15.165 "small_cache_size": 128, 00:05:15.165 "large_cache_size": 16, 00:05:15.165 "task_count": 2048, 00:05:15.165 "sequence_count": 2048, 00:05:15.165 "buf_count": 2048 00:05:15.165 } 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "bdev", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "bdev_set_options", 00:05:15.165 "params": { 00:05:15.165 "bdev_io_pool_size": 65535, 00:05:15.165 "bdev_io_cache_size": 256, 00:05:15.165 "bdev_auto_examine": true, 00:05:15.165 "iobuf_small_cache_size": 128, 00:05:15.165 "iobuf_large_cache_size": 16 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "bdev_raid_set_options", 00:05:15.165 "params": { 00:05:15.165 "process_window_size_kb": 1024, 00:05:15.165 "process_max_bandwidth_mb_sec": 0 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "bdev_iscsi_set_options", 00:05:15.165 "params": { 00:05:15.165 "timeout_sec": 30 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "bdev_nvme_set_options", 00:05:15.165 "params": { 00:05:15.165 "action_on_timeout": "none", 00:05:15.165 "timeout_us": 0, 00:05:15.165 "timeout_admin_us": 0, 00:05:15.165 "keep_alive_timeout_ms": 10000, 00:05:15.165 
"arbitration_burst": 0, 00:05:15.165 "low_priority_weight": 0, 00:05:15.165 "medium_priority_weight": 0, 00:05:15.165 "high_priority_weight": 0, 00:05:15.165 "nvme_adminq_poll_period_us": 10000, 00:05:15.165 "nvme_ioq_poll_period_us": 0, 00:05:15.165 "io_queue_requests": 0, 00:05:15.165 "delay_cmd_submit": true, 00:05:15.165 "transport_retry_count": 4, 00:05:15.165 "bdev_retry_count": 3, 00:05:15.165 "transport_ack_timeout": 0, 00:05:15.165 "ctrlr_loss_timeout_sec": 0, 00:05:15.165 "reconnect_delay_sec": 0, 00:05:15.165 "fast_io_fail_timeout_sec": 0, 00:05:15.165 "disable_auto_failback": false, 00:05:15.165 "generate_uuids": false, 00:05:15.165 "transport_tos": 0, 00:05:15.165 "nvme_error_stat": false, 00:05:15.165 "rdma_srq_size": 0, 00:05:15.165 "io_path_stat": false, 00:05:15.165 "allow_accel_sequence": false, 00:05:15.165 "rdma_max_cq_size": 0, 00:05:15.165 "rdma_cm_event_timeout_ms": 0, 00:05:15.165 "dhchap_digests": [ 00:05:15.165 "sha256", 00:05:15.165 "sha384", 00:05:15.165 "sha512" 00:05:15.165 ], 00:05:15.165 "dhchap_dhgroups": [ 00:05:15.165 "null", 00:05:15.165 "ffdhe2048", 00:05:15.165 "ffdhe3072", 00:05:15.165 "ffdhe4096", 00:05:15.165 "ffdhe6144", 00:05:15.165 "ffdhe8192" 00:05:15.165 ] 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "bdev_nvme_set_hotplug", 00:05:15.165 "params": { 00:05:15.165 "period_us": 100000, 00:05:15.165 "enable": false 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "bdev_wait_for_examine" 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "scsi", 00:05:15.165 "config": null 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "scheduler", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "framework_set_scheduler", 00:05:15.165 "params": { 00:05:15.165 "name": "static" 00:05:15.165 } 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "vhost_scsi", 00:05:15.165 "config": [] 00:05:15.165 }, 
00:05:15.165 { 00:05:15.165 "subsystem": "vhost_blk", 00:05:15.165 "config": [] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "ublk", 00:05:15.165 "config": [] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "nbd", 00:05:15.165 "config": [] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "nvmf", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "nvmf_set_config", 00:05:15.165 "params": { 00:05:15.165 "discovery_filter": "match_any", 00:05:15.165 "admin_cmd_passthru": { 00:05:15.165 "identify_ctrlr": false 00:05:15.165 } 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "nvmf_set_max_subsystems", 00:05:15.165 "params": { 00:05:15.165 "max_subsystems": 1024 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "nvmf_set_crdt", 00:05:15.165 "params": { 00:05:15.165 "crdt1": 0, 00:05:15.165 "crdt2": 0, 00:05:15.165 "crdt3": 0 00:05:15.165 } 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "method": "nvmf_create_transport", 00:05:15.165 "params": { 00:05:15.165 "trtype": "TCP", 00:05:15.165 "max_queue_depth": 128, 00:05:15.165 "max_io_qpairs_per_ctrlr": 127, 00:05:15.165 "in_capsule_data_size": 4096, 00:05:15.165 "max_io_size": 131072, 00:05:15.165 "io_unit_size": 131072, 00:05:15.165 "max_aq_depth": 128, 00:05:15.165 "num_shared_buffers": 511, 00:05:15.165 "buf_cache_size": 4294967295, 00:05:15.165 "dif_insert_or_strip": false, 00:05:15.165 "zcopy": false, 00:05:15.165 "c2h_success": true, 00:05:15.165 "sock_priority": 0, 00:05:15.165 "abort_timeout_sec": 1, 00:05:15.165 "ack_timeout": 0, 00:05:15.165 "data_wr_pool_size": 0 00:05:15.165 } 00:05:15.165 } 00:05:15.165 ] 00:05:15.165 }, 00:05:15.165 { 00:05:15.165 "subsystem": "iscsi", 00:05:15.165 "config": [ 00:05:15.165 { 00:05:15.165 "method": "iscsi_set_options", 00:05:15.165 "params": { 00:05:15.165 "node_base": "iqn.2016-06.io.spdk", 00:05:15.165 "max_sessions": 128, 00:05:15.165 "max_connections_per_session": 2, 00:05:15.165 "max_queue_depth": 
64, 00:05:15.165 "default_time2wait": 2, 00:05:15.165 "default_time2retain": 20, 00:05:15.165 "first_burst_length": 8192, 00:05:15.165 "immediate_data": true, 00:05:15.165 "allow_duplicated_isid": false, 00:05:15.165 "error_recovery_level": 0, 00:05:15.165 "nop_timeout": 60, 00:05:15.165 "nop_in_interval": 30, 00:05:15.165 "disable_chap": false, 00:05:15.165 "require_chap": false, 00:05:15.165 "mutual_chap": false, 00:05:15.165 "chap_group": 0, 00:05:15.165 "max_large_datain_per_connection": 64, 00:05:15.165 "max_r2t_per_connection": 4, 00:05:15.165 "pdu_pool_size": 36864, 00:05:15.165 "immediate_data_pool_size": 16384, 00:05:15.165 "data_out_pool_size": 2048 00:05:15.165 } 00:05:15.165 } 00:05:15.166 ] 00:05:15.166 } 00:05:15.166 ] 00:05:15.166 } 00:05:15.166 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:15.166 23:26:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 200566 00:05:15.166 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 200566 ']' 00:05:15.166 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 200566 00:05:15.166 23:26:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 200566 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 200566' 00:05:15.166 killing process with pid 200566 00:05:15.166 23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 200566 00:05:15.166 
23:27:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 200566 00:05:15.424 23:27:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=200837 00:05:15.424 23:27:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:15.424 23:27:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 200837 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 200837 ']' 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 200837 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:20.697 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 200837 00:05:20.698 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:20.698 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:20.698 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 200837' 00:05:20.698 killing process with pid 200837 00:05:20.698 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 200837 00:05:20.698 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 200837 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:20.957 00:05:20.957 real 0m6.732s 00:05:20.957 user 0m6.492s 00:05:20.957 sys 0m0.610s 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:20.957 ************************************ 00:05:20.957 END TEST skip_rpc_with_json 00:05:20.957 ************************************ 00:05:20.957 23:27:05 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:20.957 ************************************ 00:05:20.957 START TEST skip_rpc_with_delay 00:05:20.957 ************************************ 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:20.957 [2024-07-24 23:27:05.845307] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:05:20.957 [2024-07-24 23:27:05.845362] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:20.957 00:05:20.957 real 0m0.069s 00:05:20.957 user 0m0.045s 00:05:20.957 sys 0m0.023s 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:20.957 23:27:05 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:20.957 ************************************ 00:05:20.957 END TEST skip_rpc_with_delay 00:05:20.957 ************************************ 00:05:20.957 23:27:05 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:20.957 23:27:05 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:20.957 23:27:05 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:20.957 23:27:05 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:20.957 ************************************ 00:05:20.957 START TEST exit_on_failed_rpc_init 00:05:20.957 ************************************ 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=201955 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 201955 00:05:20.957 23:27:05 
skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 201955 ']' 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:20.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:20.957 23:27:05 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:21.216 [2024-07-24 23:27:05.979144] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:21.216 [2024-07-24 23:27:05.979187] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201955 ] 00:05:21.216 [2024-07-24 23:27:06.042868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:21.216 [2024-07-24 23:27:06.120692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:21.784 23:27:06 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:22.044 [2024-07-24 23:27:06.825567] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:05:22.044 [2024-07-24 23:27:06.825610] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid201974 ] 00:05:22.044 [2024-07-24 23:27:06.889907] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.044 [2024-07-24 23:27:06.962035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:22.044 [2024-07-24 23:27:06.962100] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:22.044 [2024-07-24 23:27:06.962111] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:22.044 [2024-07-24 23:27:06.962117] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 201955 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 201955 ']' 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 201955 00:05:22.044 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 201955 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 201955' 
00:05:22.303 killing process with pid 201955 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 201955 00:05:22.303 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 201955 00:05:22.562 00:05:22.562 real 0m1.472s 00:05:22.562 user 0m1.683s 00:05:22.562 sys 0m0.408s 00:05:22.562 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:22.562 23:27:07 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:22.562 ************************************ 00:05:22.562 END TEST exit_on_failed_rpc_init 00:05:22.562 ************************************ 00:05:22.562 23:27:07 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:22.562 00:05:22.562 real 0m13.969s 00:05:22.562 user 0m13.467s 00:05:22.562 sys 0m1.533s 00:05:22.563 23:27:07 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:22.563 23:27:07 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:22.563 ************************************ 00:05:22.563 END TEST skip_rpc 00:05:22.563 ************************************ 00:05:22.563 23:27:07 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:22.563 23:27:07 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:22.563 23:27:07 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:22.563 23:27:07 -- common/autotest_common.sh@10 -- # set +x 00:05:22.563 ************************************ 00:05:22.563 START TEST rpc_client 00:05:22.563 ************************************ 00:05:22.563 23:27:07 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:22.822 * Looking for test storage... 
00:05:22.822 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:22.822 23:27:07 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:22.822 OK 00:05:22.822 23:27:07 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:22.822 00:05:22.822 real 0m0.112s 00:05:22.822 user 0m0.051s 00:05:22.822 sys 0m0.069s 00:05:22.822 23:27:07 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:22.822 23:27:07 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:22.822 ************************************ 00:05:22.822 END TEST rpc_client 00:05:22.822 ************************************ 00:05:22.822 23:27:07 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:22.822 23:27:07 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:22.822 23:27:07 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:22.822 23:27:07 -- common/autotest_common.sh@10 -- # set +x 00:05:22.822 ************************************ 00:05:22.822 START TEST json_config 00:05:22.822 ************************************ 00:05:22.822 23:27:07 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:22.822 23:27:07 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:22.822 23:27:07 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:22.822 23:27:07 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:22.822 23:27:07 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:22.822 23:27:07 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:22.822 23:27:07 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:22.823 23:27:07 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:22.823 23:27:07 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:22.823 23:27:07 json_config -- paths/export.sh@5 -- # export PATH 00:05:22.823 23:27:07 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@47 -- # : 0 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:22.823 23:27:07 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@359 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:05:22.823 INFO: JSON configuration test init 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:22.823 23:27:07 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:05:22.823 23:27:07 json_config -- json_config/common.sh@9 -- # local app=target 00:05:22.823 23:27:07 json_config -- json_config/common.sh@10 -- # shift 00:05:22.823 23:27:07 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:22.823 23:27:07 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:22.823 23:27:07 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:22.823 23:27:07 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:22.823 23:27:07 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:22.823 23:27:07 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=202307 00:05:22.823 23:27:07 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:22.823 Waiting for target to run... 
00:05:22.823 23:27:07 json_config -- json_config/common.sh@25 -- # waitforlisten 202307 /var/tmp/spdk_tgt.sock 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@831 -- # '[' -z 202307 ']' 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:22.823 23:27:07 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:22.823 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:22.823 23:27:07 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:23.082 [2024-07-24 23:27:07.827371] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:23.082 [2024-07-24 23:27:07.827417] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid202307 ] 00:05:23.341 [2024-07-24 23:27:08.286717] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:23.600 [2024-07-24 23:27:08.378217] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.859 23:27:08 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:23.859 23:27:08 json_config -- common/autotest_common.sh@864 -- # return 0 00:05:23.859 23:27:08 json_config -- json_config/common.sh@26 -- # echo '' 00:05:23.859 00:05:23.859 23:27:08 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:05:23.859 23:27:08 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:05:23.859 23:27:08 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:23.859 23:27:08 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:23.859 23:27:08 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:05:23.859 23:27:08 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:23.859 23:27:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:23.859 23:27:08 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:23.859 23:27:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:24.118 [2024-07-24 23:27:08.947937] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:05:24.118 23:27:08 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:24.118 23:27:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:24.377 [2024-07-24 23:27:09.124378] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:24.377 23:27:09 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:05:24.377 23:27:09 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:24.377 23:27:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:24.377 23:27:09 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:24.377 23:27:09 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:05:24.377 23:27:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:24.377 [2024-07-24 23:27:09.356124] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:29.642 23:27:14 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:29.642 23:27:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@46 -- # local 
enabled_types 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:29.642 23:27:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@51 -- # sort 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:05:29.642 23:27:14 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:29.642 23:27:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@59 -- # return 0 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:05:29.642 23:27:14 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:29.642 23:27:14 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:29.642 23:27:14 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:05:29.642 23:27:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:05:29.901 23:27:14 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:29.901 23:27:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:29.901 Nvme0n1p0 Nvme0n1p1 00:05:29.901 23:27:14 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:29.901 23:27:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:30.159 [2024-07-24 23:27:15.043680] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:30.159 [2024-07-24 23:27:15.043721] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:30.159 00:05:30.159 23:27:15 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:30.159 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:30.417 Malloc3 00:05:30.417 23:27:15 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:30.417 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:30.417 [2024-07-24 23:27:15.392620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:30.417 [2024-07-24 23:27:15.392668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:30.417 [2024-07-24 23:27:15.392681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f83a0 00:05:30.417 [2024-07-24 23:27:15.392704] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:30.417 [2024-07-24 23:27:15.393792] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:30.417 [2024-07-24 23:27:15.393815] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:30.417 PTBdevFromMalloc3 00:05:30.417 23:27:15 
json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:30.417 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:30.675 Null0 00:05:30.675 23:27:15 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:30.675 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:30.933 Malloc0 00:05:30.933 23:27:15 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:30.933 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:30.933 Malloc1 00:05:30.933 23:27:15 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:30.933 23:27:15 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:31.193 102400+0 records in 00:05:31.193 102400+0 records out 00:05:31.193 104857600 bytes (105 MB, 100 MiB) copied, 0.114438 s, 916 MB/s 00:05:31.193 23:27:15 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:31.193 23:27:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:31.193 aio_disk 00:05:31.193 23:27:16 json_config -- 
json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:31.193 23:27:16 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:31.193 23:27:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:33.777 2b13efa8-70ea-45b4-8b74-f263e3c49c13 00:05:33.777 23:27:18 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:33.777 23:27:18 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:33.777 23:27:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:33.777 23:27:18 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:33.777 23:27:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:33.777 23:27:18 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:33.777 23:27:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:34.034 23:27:18 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:34.034 23:27:18 json_config -- json_config/common.sh@57 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:34.291 23:27:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:34.291 MallocForCryptoBdev 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@163 -- # wc -l 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:05:34.291 23:27:19 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:34.291 23:27:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:34.549 [2024-07-24 23:27:19.418764] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:34.549 CryptoMallocBdev 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 
bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:377f7689-4edc-4384-863c-d9351475e0eb bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:377f7689-4edc-4384-863c-d9351475e0eb bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@75 -- # sort 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@76 -- # sort 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.549 23:27:19 
json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:05:34.549 23:27:19 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:34.549 23:27:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:Null0 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.806 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:377f7689-4edc-4384-863c-d9351475e0eb 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee bdev_register:377f7689-4edc-4384-863c-d9351475e0eb bdev_register:aio_disk 
bdev_register:CryptoMallocBdev bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\8\a\7\7\a\a\1\-\3\b\7\3\-\4\9\7\9\-\9\1\b\d\-\3\4\2\4\9\1\0\1\b\a\7\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\f\d\0\a\9\4\b\-\7\2\9\b\-\4\4\3\d\-\b\3\2\1\-\f\1\7\c\d\0\5\4\3\c\e\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\7\7\f\7\6\8\9\-\4\e\d\c\-\4\3\8\4\-\8\6\3\c\-\d\9\3\5\1\4\7\5\e\0\e\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\3\9\8\2\b\4\1\-\4\f\e\4\-\4\f\3\0\-\b\6\a\e\-\e\b\0\2\8\a\c\d\3\3\c\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@90 -- # cat 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee bdev_register:377f7689-4edc-4384-863c-d9351475e0eb bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 
bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:34.807 Expected events matched: 00:05:34.807 bdev_register:08a77aa1-3b73-4979-91bd-34249101ba72 00:05:34.807 bdev_register:0fd0a94b-729b-443d-b321-f17cd0543cee 00:05:34.807 bdev_register:377f7689-4edc-4384-863c-d9351475e0eb 00:05:34.807 bdev_register:aio_disk 00:05:34.807 bdev_register:CryptoMallocBdev 00:05:34.807 bdev_register:e3982b41-4fe4-4f30-b6ae-eb028acd33cc 00:05:34.807 bdev_register:Malloc0 00:05:34.807 bdev_register:Malloc0p0 00:05:34.807 bdev_register:Malloc0p1 00:05:34.807 bdev_register:Malloc0p2 00:05:34.807 bdev_register:Malloc1 00:05:34.807 bdev_register:Malloc3 00:05:34.807 bdev_register:MallocForCryptoBdev 00:05:34.807 bdev_register:Null0 00:05:34.807 bdev_register:Nvme0n1 00:05:34.807 bdev_register:Nvme0n1p0 00:05:34.807 bdev_register:Nvme0n1p1 00:05:34.807 bdev_register:PTBdevFromMalloc3 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:05:34.807 23:27:19 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:34.807 23:27:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:05:34.807 23:27:19 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:34.807 23:27:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:05:34.807 23:27:19 json_config -- json_config/json_config.sh@304 -- # 
tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:34.807 23:27:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:35.065 MallocBdevForConfigChangeCheck 00:05:35.065 23:27:19 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:05:35.065 23:27:19 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:35.065 23:27:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:35.065 23:27:19 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:05:35.065 23:27:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:35.323 23:27:20 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:05:35.323 INFO: shutting down applications... 
00:05:35.323 23:27:20 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:05:35.323 23:27:20 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:05:35.323 23:27:20 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:05:35.323 23:27:20 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:05:35.580 [2024-07-24 23:27:20.381509] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:05:36.950 Calling clear_iscsi_subsystem 00:05:36.950 Calling clear_nvmf_subsystem 00:05:36.950 Calling clear_nbd_subsystem 00:05:36.950 Calling clear_ublk_subsystem 00:05:36.950 Calling clear_vhost_blk_subsystem 00:05:36.950 Calling clear_vhost_scsi_subsystem 00:05:36.950 Calling clear_bdev_subsystem 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@347 -- # count=100 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:05:36.950 23:27:21 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:05:37.208 23:27:22 json_config -- json_config/json_config.sh@349 -- # break 00:05:37.208 23:27:22 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:05:37.208 23:27:22 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:05:37.208 23:27:22 json_config -- json_config/common.sh@31 -- # local app=target 00:05:37.208 23:27:22 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:37.208 23:27:22 json_config -- json_config/common.sh@35 -- # [[ -n 202307 ]] 00:05:37.208 23:27:22 json_config -- json_config/common.sh@38 -- # kill -SIGINT 202307 00:05:37.208 23:27:22 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:37.208 23:27:22 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:37.208 23:27:22 json_config -- json_config/common.sh@41 -- # kill -0 202307 00:05:37.208 23:27:22 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:05:37.773 23:27:22 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:05:37.773 23:27:22 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:37.773 23:27:22 json_config -- json_config/common.sh@41 -- # kill -0 202307 00:05:37.773 23:27:22 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:37.773 23:27:22 json_config -- json_config/common.sh@43 -- # break 00:05:37.773 23:27:22 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:37.773 23:27:22 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:37.773 SPDK target shutdown done 00:05:37.773 23:27:22 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:05:37.773 INFO: relaunching applications... 
00:05:37.773 23:27:22 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:37.773 23:27:22 json_config -- json_config/common.sh@9 -- # local app=target 00:05:37.773 23:27:22 json_config -- json_config/common.sh@10 -- # shift 00:05:37.773 23:27:22 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:37.773 23:27:22 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:37.773 23:27:22 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:37.773 23:27:22 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:37.773 23:27:22 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:37.773 23:27:22 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=205350 00:05:37.773 23:27:22 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:37.773 Waiting for target to run... 00:05:37.773 23:27:22 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:37.773 23:27:22 json_config -- json_config/common.sh@25 -- # waitforlisten 205350 /var/tmp/spdk_tgt.sock 00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@831 -- # '[' -z 205350 ']' 00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:37.773 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:37.773 23:27:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:37.773 [2024-07-24 23:27:22.744657] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:05:37.773 [2024-07-24 23:27:22.744709] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid205350 ] 00:05:38.030 [2024-07-24 23:27:23.020211] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.288 [2024-07-24 23:27:23.088590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.288 [2024-07-24 23:27:23.142075] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:05:38.288 [2024-07-24 23:27:23.150105] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:38.288 [2024-07-24 23:27:23.158123] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:38.288 [2024-07-24 23:27:23.237403] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:40.815 [2024-07-24 23:27:25.355956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:40.815 [2024-07-24 23:27:25.355998] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:05:40.815 [2024-07-24 23:27:25.356005] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:40.815 [2024-07-24 23:27:25.363975] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:40.815 [2024-07-24 23:27:25.363990] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Nvme0n1 00:05:40.815 [2024-07-24 23:27:25.371993] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:40.815 [2024-07-24 23:27:25.372008] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:40.815 [2024-07-24 23:27:25.380022] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:05:40.815 [2024-07-24 23:27:25.380038] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:05:40.815 [2024-07-24 23:27:25.380043] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:43.344 [2024-07-24 23:27:28.235243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:43.344 [2024-07-24 23:27:28.235274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:43.344 [2024-07-24 23:27:28.235283] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x254b530 00:05:43.344 [2024-07-24 23:27:28.235289] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:43.344 [2024-07-24 23:27:28.235467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:43.344 [2024-07-24 23:27:28.235482] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:43.602 23:27:28 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:43.602 23:27:28 json_config -- common/autotest_common.sh@864 -- # return 0 00:05:43.602 23:27:28 json_config -- json_config/common.sh@26 -- # echo '' 00:05:43.602 00:05:43.602 23:27:28 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:05:43.602 23:27:28 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:05:43.602 INFO: Checking if target configuration is the same... 
00:05:43.602 23:27:28 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:05:43.602 23:27:28 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:43.602 23:27:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:43.602 + '[' 2 -ne 2 ']' 00:05:43.602 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:43.602 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:05:43.602 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:43.602 +++ basename /dev/fd/62 00:05:43.602 ++ mktemp /tmp/62.XXX 00:05:43.602 + tmp_file_1=/tmp/62.8Im 00:05:43.602 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:43.602 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:43.602 + tmp_file_2=/tmp/spdk_tgt_config.json.FGX 00:05:43.602 + ret=0 00:05:43.602 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:43.860 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:44.119 + diff -u /tmp/62.8Im /tmp/spdk_tgt_config.json.FGX 00:05:44.119 + echo 'INFO: JSON config files are the same' 00:05:44.119 INFO: JSON config files are the same 00:05:44.119 + rm /tmp/62.8Im /tmp/spdk_tgt_config.json.FGX 00:05:44.119 + exit 0 00:05:44.119 23:27:28 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:05:44.119 23:27:28 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:05:44.119 INFO: changing configuration and checking if this can be detected... 
00:05:44.119 23:27:28 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:44.119 23:27:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:44.119 23:27:29 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:44.119 23:27:29 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:05:44.119 23:27:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:44.119 + '[' 2 -ne 2 ']' 00:05:44.119 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:44.119 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:05:44.119 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:44.119 +++ basename /dev/fd/62 00:05:44.119 ++ mktemp /tmp/62.XXX 00:05:44.119 + tmp_file_1=/tmp/62.O76 00:05:44.119 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:44.119 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:44.119 + tmp_file_2=/tmp/spdk_tgt_config.json.kOL 00:05:44.119 + ret=0 00:05:44.120 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:44.686 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:44.686 + diff -u /tmp/62.O76 /tmp/spdk_tgt_config.json.kOL 00:05:44.686 + ret=1 00:05:44.686 + echo '=== Start of file: /tmp/62.O76 ===' 00:05:44.686 + cat /tmp/62.O76 00:05:44.686 + echo '=== End of file: /tmp/62.O76 ===' 00:05:44.686 + echo '' 00:05:44.686 + echo '=== Start of file: /tmp/spdk_tgt_config.json.kOL ===' 00:05:44.686 + cat /tmp/spdk_tgt_config.json.kOL 00:05:44.686 + echo '=== End of file: /tmp/spdk_tgt_config.json.kOL ===' 00:05:44.686 + echo '' 00:05:44.686 + rm /tmp/62.O76 /tmp/spdk_tgt_config.json.kOL 00:05:44.686 + exit 1 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:05:44.686 INFO: configuration change detected. 
00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:05:44.686 23:27:29 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:44.686 23:27:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@321 -- # [[ -n 205350 ]] 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:05:44.686 23:27:29 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:44.686 23:27:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:05:44.686 23:27:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:05:44.686 23:27:29 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:05:44.686 23:27:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:05:44.944 23:27:29 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:05:44.944 23:27:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:05:44.944 23:27:29 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:05:44.944 23:27:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@197 -- # uname -s 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:45.203 23:27:30 json_config -- json_config/json_config.sh@327 -- # killprocess 205350 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@950 -- # '[' -z 205350 ']' 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@954 -- # kill -0 205350 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@955 -- # uname 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 205350 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 205350' 00:05:45.203 killing process with pid 205350 00:05:45.203 23:27:30 json_config -- common/autotest_common.sh@969 -- # kill 205350 00:05:45.203 23:27:30 json_config -- 
common/autotest_common.sh@974 -- # wait 205350 00:05:47.103 23:27:31 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:47.103 23:27:31 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:05:47.103 23:27:31 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:47.103 23:27:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.103 23:27:31 json_config -- json_config/json_config.sh@332 -- # return 0 00:05:47.103 23:27:31 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:05:47.103 INFO: Success 00:05:47.103 00:05:47.103 real 0m24.175s 00:05:47.103 user 0m27.382s 00:05:47.103 sys 0m2.611s 00:05:47.103 23:27:31 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:47.103 23:27:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.103 ************************************ 00:05:47.103 END TEST json_config 00:05:47.103 ************************************ 00:05:47.103 23:27:31 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:47.103 23:27:31 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:47.103 23:27:31 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:47.103 23:27:31 -- common/autotest_common.sh@10 -- # set +x 00:05:47.103 ************************************ 00:05:47.103 START TEST json_config_extra_key 00:05:47.103 ************************************ 00:05:47.103 23:27:31 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:47.103 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:47.103 23:27:31 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:47.103 23:27:31 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:05:47.103 23:27:31 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:47.103 23:27:31 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:47.103 23:27:31 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:47.104 23:27:31 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:47.104 23:27:31 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:47.104 23:27:31 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:05:47.104 23:27:31 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:47.104 23:27:31 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:47.104 INFO: launching applications... 00:05:47.104 23:27:31 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=207058 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:47.104 Waiting for target to run... 
00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 207058 /var/tmp/spdk_tgt.sock 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 207058 ']' 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:47.104 23:27:31 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:47.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:47.104 23:27:31 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:47.104 [2024-07-24 23:27:32.035785] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:47.104 [2024-07-24 23:27:32.035830] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207058 ] 00:05:47.362 [2024-07-24 23:27:32.311330] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.620 [2024-07-24 23:27:32.378729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.879 23:27:32 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:47.879 23:27:32 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:05:47.879 00:05:47.879 23:27:32 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:47.879 INFO: shutting down applications... 00:05:47.879 23:27:32 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 207058 ]] 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 207058 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 207058 00:05:47.879 23:27:32 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:05:48.446 23:27:33 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 207058 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@43 -- # break 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:48.446 23:27:33 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:48.446 SPDK target shutdown done 00:05:48.446 23:27:33 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:48.446 Success 00:05:48.446 00:05:48.446 real 0m1.416s 00:05:48.446 user 0m1.026s 00:05:48.446 sys 0m0.377s 00:05:48.446 23:27:33 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:48.446 23:27:33 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:48.446 ************************************ 00:05:48.446 END TEST json_config_extra_key 00:05:48.446 ************************************ 00:05:48.446 23:27:33 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:48.446 23:27:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:48.446 23:27:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:48.446 23:27:33 -- common/autotest_common.sh@10 -- # set +x 00:05:48.446 ************************************ 00:05:48.446 START TEST alias_rpc 00:05:48.446 ************************************ 00:05:48.446 23:27:33 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:48.704 * Looking for test storage... 
00:05:48.704 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:05:48.704 23:27:33 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:48.704 23:27:33 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=207338 00:05:48.704 23:27:33 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 207338 00:05:48.704 23:27:33 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 207338 ']' 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:48.704 23:27:33 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:48.704 [2024-07-24 23:27:33.543768] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:48.704 [2024-07-24 23:27:33.543814] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207338 ] 00:05:48.704 [2024-07-24 23:27:33.607980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.704 [2024-07-24 23:27:33.684765] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:05:49.639 23:27:34 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:49.639 23:27:34 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 207338 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 207338 ']' 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 207338 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 207338 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 207338' 00:05:49.639 killing process with pid 207338 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@969 -- # kill 207338 00:05:49.639 23:27:34 alias_rpc -- common/autotest_common.sh@974 -- # wait 207338 00:05:49.897 00:05:49.897 real 0m1.481s 00:05:49.897 user 0m1.595s 00:05:49.897 sys 0m0.399s 00:05:49.897 23:27:34 alias_rpc -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:05:49.897 23:27:34 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:49.897 ************************************ 00:05:49.897 END TEST alias_rpc 00:05:49.897 ************************************ 00:05:50.156 23:27:34 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:05:50.156 23:27:34 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:50.156 23:27:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:50.156 23:27:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:50.156 23:27:34 -- common/autotest_common.sh@10 -- # set +x 00:05:50.156 ************************************ 00:05:50.156 START TEST spdkcli_tcp 00:05:50.156 ************************************ 00:05:50.156 23:27:34 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:50.156 * Looking for test storage... 
00:05:50.156 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=207623 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 207623 00:05:50.156 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 207623 ']' 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:50.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:50.156 23:27:35 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:50.156 [2024-07-24 23:27:35.091629] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:05:50.156 [2024-07-24 23:27:35.091680] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207623 ] 00:05:50.156 [2024-07-24 23:27:35.155826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:50.414 [2024-07-24 23:27:35.233887] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:50.414 [2024-07-24 23:27:35.233889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.981 23:27:35 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:50.981 23:27:35 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:05:50.981 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=207846 00:05:50.981 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:50.981 23:27:35 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:51.241 [ 00:05:51.241 "bdev_malloc_delete", 00:05:51.241 "bdev_malloc_create", 00:05:51.241 "bdev_null_resize", 00:05:51.241 "bdev_null_delete", 00:05:51.241 "bdev_null_create", 00:05:51.241 "bdev_nvme_cuse_unregister", 00:05:51.241 "bdev_nvme_cuse_register", 00:05:51.241 "bdev_opal_new_user", 00:05:51.241 "bdev_opal_set_lock_state", 00:05:51.241 "bdev_opal_delete", 00:05:51.241 "bdev_opal_get_info", 00:05:51.241 "bdev_opal_create", 00:05:51.241 "bdev_nvme_opal_revert", 00:05:51.241 "bdev_nvme_opal_init", 00:05:51.241 "bdev_nvme_send_cmd", 00:05:51.241 
"bdev_nvme_get_path_iostat", 00:05:51.241 "bdev_nvme_get_mdns_discovery_info", 00:05:51.241 "bdev_nvme_stop_mdns_discovery", 00:05:51.241 "bdev_nvme_start_mdns_discovery", 00:05:51.241 "bdev_nvme_set_multipath_policy", 00:05:51.241 "bdev_nvme_set_preferred_path", 00:05:51.241 "bdev_nvme_get_io_paths", 00:05:51.241 "bdev_nvme_remove_error_injection", 00:05:51.241 "bdev_nvme_add_error_injection", 00:05:51.241 "bdev_nvme_get_discovery_info", 00:05:51.241 "bdev_nvme_stop_discovery", 00:05:51.241 "bdev_nvme_start_discovery", 00:05:51.241 "bdev_nvme_get_controller_health_info", 00:05:51.241 "bdev_nvme_disable_controller", 00:05:51.241 "bdev_nvme_enable_controller", 00:05:51.241 "bdev_nvme_reset_controller", 00:05:51.241 "bdev_nvme_get_transport_statistics", 00:05:51.241 "bdev_nvme_apply_firmware", 00:05:51.241 "bdev_nvme_detach_controller", 00:05:51.241 "bdev_nvme_get_controllers", 00:05:51.241 "bdev_nvme_attach_controller", 00:05:51.241 "bdev_nvme_set_hotplug", 00:05:51.241 "bdev_nvme_set_options", 00:05:51.241 "bdev_passthru_delete", 00:05:51.241 "bdev_passthru_create", 00:05:51.241 "bdev_lvol_set_parent_bdev", 00:05:51.241 "bdev_lvol_set_parent", 00:05:51.241 "bdev_lvol_check_shallow_copy", 00:05:51.241 "bdev_lvol_start_shallow_copy", 00:05:51.241 "bdev_lvol_grow_lvstore", 00:05:51.241 "bdev_lvol_get_lvols", 00:05:51.241 "bdev_lvol_get_lvstores", 00:05:51.241 "bdev_lvol_delete", 00:05:51.241 "bdev_lvol_set_read_only", 00:05:51.241 "bdev_lvol_resize", 00:05:51.241 "bdev_lvol_decouple_parent", 00:05:51.241 "bdev_lvol_inflate", 00:05:51.241 "bdev_lvol_rename", 00:05:51.241 "bdev_lvol_clone_bdev", 00:05:51.241 "bdev_lvol_clone", 00:05:51.241 "bdev_lvol_snapshot", 00:05:51.241 "bdev_lvol_create", 00:05:51.241 "bdev_lvol_delete_lvstore", 00:05:51.241 "bdev_lvol_rename_lvstore", 00:05:51.241 "bdev_lvol_create_lvstore", 00:05:51.241 "bdev_raid_set_options", 00:05:51.241 "bdev_raid_remove_base_bdev", 00:05:51.241 "bdev_raid_add_base_bdev", 00:05:51.241 "bdev_raid_delete", 
00:05:51.241 "bdev_raid_create", 00:05:51.241 "bdev_raid_get_bdevs", 00:05:51.241 "bdev_error_inject_error", 00:05:51.241 "bdev_error_delete", 00:05:51.241 "bdev_error_create", 00:05:51.241 "bdev_split_delete", 00:05:51.241 "bdev_split_create", 00:05:51.241 "bdev_delay_delete", 00:05:51.241 "bdev_delay_create", 00:05:51.241 "bdev_delay_update_latency", 00:05:51.241 "bdev_zone_block_delete", 00:05:51.241 "bdev_zone_block_create", 00:05:51.241 "blobfs_create", 00:05:51.241 "blobfs_detect", 00:05:51.241 "blobfs_set_cache_size", 00:05:51.241 "bdev_crypto_delete", 00:05:51.241 "bdev_crypto_create", 00:05:51.241 "bdev_compress_delete", 00:05:51.241 "bdev_compress_create", 00:05:51.241 "bdev_compress_get_orphans", 00:05:51.241 "bdev_aio_delete", 00:05:51.241 "bdev_aio_rescan", 00:05:51.241 "bdev_aio_create", 00:05:51.241 "bdev_ftl_set_property", 00:05:51.241 "bdev_ftl_get_properties", 00:05:51.241 "bdev_ftl_get_stats", 00:05:51.241 "bdev_ftl_unmap", 00:05:51.241 "bdev_ftl_unload", 00:05:51.241 "bdev_ftl_delete", 00:05:51.241 "bdev_ftl_load", 00:05:51.241 "bdev_ftl_create", 00:05:51.241 "bdev_virtio_attach_controller", 00:05:51.241 "bdev_virtio_scsi_get_devices", 00:05:51.241 "bdev_virtio_detach_controller", 00:05:51.241 "bdev_virtio_blk_set_hotplug", 00:05:51.241 "bdev_iscsi_delete", 00:05:51.241 "bdev_iscsi_create", 00:05:51.241 "bdev_iscsi_set_options", 00:05:51.241 "accel_error_inject_error", 00:05:51.241 "ioat_scan_accel_module", 00:05:51.241 "dsa_scan_accel_module", 00:05:51.241 "iaa_scan_accel_module", 00:05:51.241 "dpdk_cryptodev_get_driver", 00:05:51.241 "dpdk_cryptodev_set_driver", 00:05:51.241 "dpdk_cryptodev_scan_accel_module", 00:05:51.241 "compressdev_scan_accel_module", 00:05:51.241 "keyring_file_remove_key", 00:05:51.241 "keyring_file_add_key", 00:05:51.241 "keyring_linux_set_options", 00:05:51.241 "iscsi_get_histogram", 00:05:51.241 "iscsi_enable_histogram", 00:05:51.241 "iscsi_set_options", 00:05:51.241 "iscsi_get_auth_groups", 00:05:51.241 
"iscsi_auth_group_remove_secret", 00:05:51.241 "iscsi_auth_group_add_secret", 00:05:51.241 "iscsi_delete_auth_group", 00:05:51.241 "iscsi_create_auth_group", 00:05:51.241 "iscsi_set_discovery_auth", 00:05:51.241 "iscsi_get_options", 00:05:51.241 "iscsi_target_node_request_logout", 00:05:51.241 "iscsi_target_node_set_redirect", 00:05:51.241 "iscsi_target_node_set_auth", 00:05:51.241 "iscsi_target_node_add_lun", 00:05:51.241 "iscsi_get_stats", 00:05:51.241 "iscsi_get_connections", 00:05:51.241 "iscsi_portal_group_set_auth", 00:05:51.241 "iscsi_start_portal_group", 00:05:51.241 "iscsi_delete_portal_group", 00:05:51.241 "iscsi_create_portal_group", 00:05:51.241 "iscsi_get_portal_groups", 00:05:51.241 "iscsi_delete_target_node", 00:05:51.241 "iscsi_target_node_remove_pg_ig_maps", 00:05:51.241 "iscsi_target_node_add_pg_ig_maps", 00:05:51.241 "iscsi_create_target_node", 00:05:51.241 "iscsi_get_target_nodes", 00:05:51.241 "iscsi_delete_initiator_group", 00:05:51.241 "iscsi_initiator_group_remove_initiators", 00:05:51.241 "iscsi_initiator_group_add_initiators", 00:05:51.241 "iscsi_create_initiator_group", 00:05:51.242 "iscsi_get_initiator_groups", 00:05:51.242 "nvmf_set_crdt", 00:05:51.242 "nvmf_set_config", 00:05:51.242 "nvmf_set_max_subsystems", 00:05:51.242 "nvmf_stop_mdns_prr", 00:05:51.242 "nvmf_publish_mdns_prr", 00:05:51.242 "nvmf_subsystem_get_listeners", 00:05:51.242 "nvmf_subsystem_get_qpairs", 00:05:51.242 "nvmf_subsystem_get_controllers", 00:05:51.242 "nvmf_get_stats", 00:05:51.242 "nvmf_get_transports", 00:05:51.242 "nvmf_create_transport", 00:05:51.242 "nvmf_get_targets", 00:05:51.242 "nvmf_delete_target", 00:05:51.242 "nvmf_create_target", 00:05:51.242 "nvmf_subsystem_allow_any_host", 00:05:51.242 "nvmf_subsystem_remove_host", 00:05:51.242 "nvmf_subsystem_add_host", 00:05:51.242 "nvmf_ns_remove_host", 00:05:51.242 "nvmf_ns_add_host", 00:05:51.242 "nvmf_subsystem_remove_ns", 00:05:51.242 "nvmf_subsystem_add_ns", 00:05:51.242 
"nvmf_subsystem_listener_set_ana_state", 00:05:51.242 "nvmf_discovery_get_referrals", 00:05:51.242 "nvmf_discovery_remove_referral", 00:05:51.242 "nvmf_discovery_add_referral", 00:05:51.242 "nvmf_subsystem_remove_listener", 00:05:51.242 "nvmf_subsystem_add_listener", 00:05:51.242 "nvmf_delete_subsystem", 00:05:51.242 "nvmf_create_subsystem", 00:05:51.242 "nvmf_get_subsystems", 00:05:51.242 "env_dpdk_get_mem_stats", 00:05:51.242 "nbd_get_disks", 00:05:51.242 "nbd_stop_disk", 00:05:51.242 "nbd_start_disk", 00:05:51.242 "ublk_recover_disk", 00:05:51.242 "ublk_get_disks", 00:05:51.242 "ublk_stop_disk", 00:05:51.242 "ublk_start_disk", 00:05:51.242 "ublk_destroy_target", 00:05:51.242 "ublk_create_target", 00:05:51.242 "virtio_blk_create_transport", 00:05:51.242 "virtio_blk_get_transports", 00:05:51.242 "vhost_controller_set_coalescing", 00:05:51.242 "vhost_get_controllers", 00:05:51.242 "vhost_delete_controller", 00:05:51.242 "vhost_create_blk_controller", 00:05:51.242 "vhost_scsi_controller_remove_target", 00:05:51.242 "vhost_scsi_controller_add_target", 00:05:51.242 "vhost_start_scsi_controller", 00:05:51.242 "vhost_create_scsi_controller", 00:05:51.242 "thread_set_cpumask", 00:05:51.242 "framework_get_governor", 00:05:51.242 "framework_get_scheduler", 00:05:51.242 "framework_set_scheduler", 00:05:51.242 "framework_get_reactors", 00:05:51.242 "thread_get_io_channels", 00:05:51.242 "thread_get_pollers", 00:05:51.242 "thread_get_stats", 00:05:51.242 "framework_monitor_context_switch", 00:05:51.242 "spdk_kill_instance", 00:05:51.242 "log_enable_timestamps", 00:05:51.242 "log_get_flags", 00:05:51.242 "log_clear_flag", 00:05:51.242 "log_set_flag", 00:05:51.242 "log_get_level", 00:05:51.242 "log_set_level", 00:05:51.242 "log_get_print_level", 00:05:51.242 "log_set_print_level", 00:05:51.242 "framework_enable_cpumask_locks", 00:05:51.242 "framework_disable_cpumask_locks", 00:05:51.242 "framework_wait_init", 00:05:51.242 "framework_start_init", 00:05:51.242 "scsi_get_devices", 
00:05:51.242 "bdev_get_histogram", 00:05:51.242 "bdev_enable_histogram", 00:05:51.242 "bdev_set_qos_limit", 00:05:51.242 "bdev_set_qd_sampling_period", 00:05:51.242 "bdev_get_bdevs", 00:05:51.242 "bdev_reset_iostat", 00:05:51.242 "bdev_get_iostat", 00:05:51.242 "bdev_examine", 00:05:51.242 "bdev_wait_for_examine", 00:05:51.242 "bdev_set_options", 00:05:51.242 "notify_get_notifications", 00:05:51.242 "notify_get_types", 00:05:51.242 "accel_get_stats", 00:05:51.242 "accel_set_options", 00:05:51.242 "accel_set_driver", 00:05:51.242 "accel_crypto_key_destroy", 00:05:51.242 "accel_crypto_keys_get", 00:05:51.242 "accel_crypto_key_create", 00:05:51.242 "accel_assign_opc", 00:05:51.242 "accel_get_module_info", 00:05:51.242 "accel_get_opc_assignments", 00:05:51.242 "vmd_rescan", 00:05:51.242 "vmd_remove_device", 00:05:51.242 "vmd_enable", 00:05:51.242 "sock_get_default_impl", 00:05:51.242 "sock_set_default_impl", 00:05:51.242 "sock_impl_set_options", 00:05:51.242 "sock_impl_get_options", 00:05:51.242 "iobuf_get_stats", 00:05:51.242 "iobuf_set_options", 00:05:51.242 "framework_get_pci_devices", 00:05:51.242 "framework_get_config", 00:05:51.242 "framework_get_subsystems", 00:05:51.242 "trace_get_info", 00:05:51.242 "trace_get_tpoint_group_mask", 00:05:51.242 "trace_disable_tpoint_group", 00:05:51.242 "trace_enable_tpoint_group", 00:05:51.242 "trace_clear_tpoint_mask", 00:05:51.242 "trace_set_tpoint_mask", 00:05:51.242 "keyring_get_keys", 00:05:51.242 "spdk_get_version", 00:05:51.242 "rpc_get_methods" 00:05:51.242 ] 00:05:51.242 23:27:36 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:51.242 23:27:36 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:51.242 23:27:36 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 207623 00:05:51.242 23:27:36 spdkcli_tcp -- 
common/autotest_common.sh@950 -- # '[' -z 207623 ']' 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 207623 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 207623 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 207623' 00:05:51.242 killing process with pid 207623 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 207623 00:05:51.242 23:27:36 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 207623 00:05:51.502 00:05:51.502 real 0m1.504s 00:05:51.502 user 0m2.761s 00:05:51.502 sys 0m0.434s 00:05:51.502 23:27:36 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:51.502 23:27:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:51.502 ************************************ 00:05:51.502 END TEST spdkcli_tcp 00:05:51.502 ************************************ 00:05:51.502 23:27:36 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:51.502 23:27:36 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:51.502 23:27:36 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:51.502 23:27:36 -- common/autotest_common.sh@10 -- # set +x 00:05:51.761 ************************************ 00:05:51.761 START TEST dpdk_mem_utility 00:05:51.761 ************************************ 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:51.761 * Looking for test storage... 00:05:51.761 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:05:51.761 23:27:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:51.761 23:27:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=207925 00:05:51.761 23:27:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 207925 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 207925 ']' 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:51.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:51.761 23:27:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:51.761 23:27:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:51.761 [2024-07-24 23:27:36.658623] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:51.761 [2024-07-24 23:27:36.658668] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid207925 ] 00:05:51.761 [2024-07-24 23:27:36.723091] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.020 [2024-07-24 23:27:36.800300] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.592 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:52.592 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:05:52.592 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:52.592 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:52.592 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.592 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:52.592 { 00:05:52.592 "filename": "/tmp/spdk_mem_dump.txt" 00:05:52.592 } 00:05:52.592 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.592 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:52.592 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:52.592 1 heaps totaling size 814.000000 MiB 00:05:52.592 size: 814.000000 MiB heap id: 0 00:05:52.592 end heaps---------- 00:05:52.592 8 mempools totaling size 598.116089 MiB 00:05:52.592 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:52.592 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:52.592 size: 84.521057 MiB name: bdev_io_207925 00:05:52.592 size: 51.011292 MiB name: evtpool_207925 00:05:52.592 size: 50.003479 MiB name: msgpool_207925 00:05:52.592 size: 
21.763794 MiB name: PDU_Pool 00:05:52.592 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:52.592 size: 0.026123 MiB name: Session_Pool 00:05:52.592 end mempools------- 00:05:52.592 201 memzones totaling size 4.176453 MiB 00:05:52.592 size: 1.000366 MiB name: RG_ring_0_207925 00:05:52.592 size: 1.000366 MiB name: RG_ring_1_207925 00:05:52.592 size: 1.000366 MiB name: RG_ring_4_207925 00:05:52.592 size: 1.000366 MiB name: RG_ring_5_207925 00:05:52.592 size: 0.125366 MiB name: RG_ring_2_207925 00:05:52.592 size: 0.015991 MiB name: RG_ring_3_207925 00:05:52.592 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.0_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:01.7_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.0_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1a:02.7_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.0_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:01.7_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.0_qat 00:05:52.592 
size: 0.000305 MiB name: 0000:1c:02.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1c:02.7_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.0_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:01.7_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.0_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.1_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.2_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.3_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.4_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.5_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.6_qat 00:05:52.592 size: 0.000305 MiB name: 0000:1e:02.7_qat 00:05:52.592 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_0 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_1 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_0 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_2 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_3 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_1 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_4 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_5 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_2 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_6 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_7 00:05:52.592 size: 0.000122 MiB 
name: rte_compressdev_data_3 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_8 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_9 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_4 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_10 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_11 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_5 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_12 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_13 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_6 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_14 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_15 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_7 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_16 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_17 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_8 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_18 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_19 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_9 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_20 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_21 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_10 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_22 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_23 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_11 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_24 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_25 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_12 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_26 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_27 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_13 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_28 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_29 00:05:52.592 size: 0.000122 MiB name: 
rte_compressdev_data_14 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_30 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_31 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_15 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_32 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_33 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_16 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_34 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_35 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_17 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_36 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_37 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_18 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_38 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_39 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_19 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_40 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_41 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_20 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_42 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_43 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_21 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_44 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_45 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_22 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_46 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_47 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_23 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_48 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_49 00:05:52.592 size: 0.000122 MiB name: rte_compressdev_data_24 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_50 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_51 00:05:52.592 size: 0.000122 MiB 
name: rte_compressdev_data_25 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_52 00:05:52.592 size: 0.000122 MiB name: rte_cryptodev_data_53 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_26 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_54 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_55 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_27 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_56 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_57 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_28 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_58 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_59 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_29 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_60 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_61 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_30 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_62 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_63 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_31 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_64 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_65 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_32 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_66 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_67 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_33 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_68 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_69 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_34 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_70 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_71 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_35 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_72 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_73 00:05:52.593 size: 0.000122 
MiB name: rte_compressdev_data_36 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_74 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_75 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_37 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_76 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_77 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_38 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_78 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_79 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_39 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_80 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_81 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_40 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_82 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_83 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_41 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_84 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_85 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_42 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_86 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_87 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_43 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_88 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_89 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_44 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_90 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_91 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_45 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_92 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_93 00:05:52.593 size: 0.000122 MiB name: rte_compressdev_data_46 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_94 00:05:52.593 size: 0.000122 MiB name: rte_cryptodev_data_95 00:05:52.593 size: 
0.000122 MiB name: rte_compressdev_data_47 00:05:52.593 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:05:52.593 end memzones------- 00:05:52.593 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:52.593 heap id: 0 total size: 814.000000 MiB number of busy elements: 637 number of free elements: 14 00:05:52.593 list of free elements. size: 11.781372 MiB 00:05:52.593 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:52.593 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:52.593 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:52.593 element at address: 0x200003e00000 with size: 0.996460 MiB 00:05:52.593 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:52.593 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:52.593 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:52.593 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:52.593 element at address: 0x20001aa00000 with size: 0.564758 MiB 00:05:52.593 element at address: 0x200003a00000 with size: 0.494507 MiB 00:05:52.593 element at address: 0x20000b200000 with size: 0.488892 MiB 00:05:52.593 element at address: 0x200000800000 with size: 0.486511 MiB 00:05:52.593 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:52.593 element at address: 0x200027e00000 with size: 0.395752 MiB 00:05:52.593 list of standard malloc elements. 
size: 199.898621 MiB 00:05:52.593 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:52.593 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:52.593 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:52.593 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:52.593 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:52.593 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:52.593 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:52.593 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:52.593 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000032f740 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000333200 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000033a780 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000033e240 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000341d00 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000349280 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000350800 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000357d80 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000035b840 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000035f300 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000366880 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000036a340 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000036de00 with size: 0.004395 MiB 00:05:52.593 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000375380 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000378e40 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000037c900 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000383e80 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000387940 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000038b400 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000392980 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000396440 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000399f00 with size: 0.004395 MiB 00:05:52.593 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:05:52.593 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:05:52.593 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:05:52.593 element at address: 0x200000329b80 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000032d640 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000331100 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000332180 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000335c40 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000338680 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000339700 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000033c140 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:05:52.593 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000340c80 with size: 0.004028 MiB 00:05:52.593 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000344740 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000347180 with size: 0.004028 MiB 00:05:52.593 element at address: 0x200000348200 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000034e700 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000034f780 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000353240 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000355c80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000356d00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000359740 with 
size: 0.004028 MiB 00:05:52.594 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000035d200 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000035e280 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000361d40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000364780 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000365800 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000368240 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000370840 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000373280 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000374300 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000376d40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000037a800 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000037b880 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000037f340 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000381d80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000382e00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000385840 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000389300 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000038a380 with size: 0.004028 MiB 00:05:52.594 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000038de40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000390880 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000391900 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000394340 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000397e00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000398e80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000039c940 with size: 0.004028 MiB 00:05:52.594 element at address: 0x20000039f380 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:05:52.594 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:05:52.594 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:05:52.594 element at address: 0x200000200000 with size: 0.000305 MiB 00:05:52.594 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:52.594 element at address: 0x200000200140 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200200 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200380 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200440 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200500 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200680 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200740 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200800 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200980 with size: 0.000183 
MiB 00:05:52.594 element at address: 0x200000200a40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200b00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200c80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200d40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200e00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002010c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000205380 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225640 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225700 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225880 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225940 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225a00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225b80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225c40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225d00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225e80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000225f40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226000 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226180 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226240 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226300 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226500 
with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002265c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226680 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226740 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226800 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226980 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226a40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226b00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226c80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226d40 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226e00 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000226f80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000227040 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000227100 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000329300 with size: 0.000183 MiB 00:05:52.594 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000329580 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000329640 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000329800 with size: 0.000183 MiB 00:05:52.594 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x20000032d040 with size: 0.000183 MiB 00:05:52.594 element at address: 0x20000032d100 with size: 0.000183 MiB 00:05:52.594 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000330940 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000330b00 with size: 0.000183 MiB 00:05:52.594 element at 
address: 0x200000330bc0 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000330d80 with size: 0.000183 MiB 00:05:52.594 element at address: 0x200000334400 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003345c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000334680 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000334840 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000337ec0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000338080 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000338140 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000338300 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033b980 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033bb40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033bc00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033f440 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033f600 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033f6c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000033f880 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000342f00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003430c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000343180 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000343340 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003469c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000346b80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000346c40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000346e00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034a480 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034a640 with size: 0.000183 MiB 
00:05:52.595 element at address: 0x20000034a700 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034a8c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034df40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034e100 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034e1c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000034e380 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000351a00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000351bc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000351c80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000351e40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003554c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000355680 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000355740 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000355900 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000358f80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000359140 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000359200 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003593c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000035ca40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000035cc00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000035ccc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000035ce80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000360500 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003606c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000360780 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000360940 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000364180 with 
size: 0.000183 MiB 00:05:52.595 element at address: 0x200000364240 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000364400 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000367a80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000367c40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000367d00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036b540 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036b700 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036b980 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036f000 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036f280 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000036f440 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000372c80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000372d40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000372f00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000376580 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000376740 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000376800 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037a040 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037a200 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037a480 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037db00 with size: 0.000183 MiB 00:05:52.595 element at address: 
0x20000037dcc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000037df40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000381780 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000381840 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000381a00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000385080 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000385240 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000385300 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000388b40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000388d00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000388f80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000038c600 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000038c880 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000390280 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000390340 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000390500 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000393b80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000393d40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000393e00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000397640 with size: 0.000183 MiB 00:05:52.595 
element at address: 0x200000397800 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x200000397a80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039b100 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039b380 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039b540 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x20000039f000 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a9dc0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b1180 with size: 0.000183 
MiB 00:05:52.595 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:05:52.595 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c3740 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003cacc0 
with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087c980 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7e980 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ea40 with size: 0.000183 MiB 00:05:52.596 element at 
address: 0x200003a7eb00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ebc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ec80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ed40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ee00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7eec0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7ef80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f040 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f100 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f1c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f280 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:05:52.596 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d700 with size: 0.000183 MiB 
00:05:52.596 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa90f40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91600 with 
size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:05:52.596 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:05:52.597 element at address: 
0x20001aa92b00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa934c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:05:52.597 
element at address: 0x20001aa94000 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:52.597 element at address: 0x20001aa95440 with size: 0.000183 
MiB 00:05:52.597 element at address: 0x200027e65500 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d5c0 
with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:05:52.597 element at 
address: 0x200027e6eac0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:05:52.597 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:52.598 element at address: 0x200027e6ff00 with size: 0.000183 MiB 
00:05:52.598 list of memzone associated elements. size: 602.320007 MiB
00:05:52.598 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:05:52.598 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:05:52.598 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:05:52.598 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:05:52.598 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:05:52.598 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_207925_0
00:05:52.598 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:05:52.598 associated memzone info: size: 48.002930 MiB name: MP_evtpool_207925_0
00:05:52.598 element at address: 0x200003fff380 with size: 48.003052 MiB
00:05:52.598 associated memzone info: size: 48.002930 MiB name: MP_msgpool_207925_0
00:05:52.598 element at address: 0x2000195be940 with size: 20.255554 MiB
00:05:52.598 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:05:52.598 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:05:52.598 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:05:52.598 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:05:52.598 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_207925
00:05:52.598 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:05:52.598 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_207925
00:05:52.598 element at address: 0x2000002271c0 with size: 1.008118 MiB
00:05:52.598 associated memzone info: size: 1.007996 MiB name: MP_evtpool_207925
00:05:52.598 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:05:52.598 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:05:52.598 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:05:52.598 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:05:52.598 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:05:52.598 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:05:52.598 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:05:52.598 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:05:52.598 element at address: 0x200003eff180 with size: 1.000488 MiB
00:05:52.598 associated memzone info: size: 1.000366 MiB name: RG_ring_0_207925
00:05:52.598 element at address: 0x200003affc00 with size: 1.000488 MiB
00:05:52.598 associated memzone info: size: 1.000366 MiB name: RG_ring_1_207925
00:05:52.598 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:05:52.598 associated memzone info: size: 1.000366 MiB name: RG_ring_4_207925
00:05:52.598 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:05:52.598 associated memzone info: size: 1.000366 MiB name: RG_ring_5_207925
00:05:52.598 element at address: 0x200003a7fa00 with size: 0.500488 MiB
00:05:52.598 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_207925
00:05:52.598 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:05:52.598 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:05:52.598 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:05:52.598 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:05:52.598 element at address: 0x20001947c540 with size: 0.250488 MiB
00:05:52.598 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:05:52.598 element at address: 0x200000205440 with size: 0.125488 MiB
00:05:52.598 associated memzone info: size: 0.125366 MiB name: RG_ring_2_207925
00:05:52.598 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:05:52.598 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:05:52.598 element at address: 0x200027e65680 with size: 0.023743 MiB
00:05:52.598 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:05:52.598 element at address: 0x200000201180 with size: 0.016113 MiB
00:05:52.598 associated memzone info: size: 0.015991 MiB name: RG_ring_3_207925
00:05:52.598 element at address: 0x200027e6b7c0 with size: 0.002441 MiB
00:05:52.598 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:05:52.598 element at address: 0x2000003d5f80 with size: 0.001282 MiB
00:05:52.598 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:05:52.598 element at address: 0x2000003d6a40 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.0_qat
00:05:52.598 element at address: 0x2000003d2840 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.1_qat
00:05:52.598 element at address: 0x2000003ced80 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.2_qat
00:05:52.598 element at address: 0x2000003cb2c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.3_qat
00:05:52.598 element at address: 0x2000003c7800 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.4_qat
00:05:52.598 element at address: 0x2000003c3d40 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.5_qat
00:05:52.598 element at address: 0x2000003c0280 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.6_qat
00:05:52.598 element at address: 0x2000003bc7c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.7_qat
00:05:52.598 element at address: 0x2000003b8d00 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.0_qat
00:05:52.598 element at address: 0x2000003b5240 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.1_qat
00:05:52.598 element at address: 0x2000003b1780 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.2_qat
00:05:52.598 element at address: 0x2000003adcc0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.3_qat
00:05:52.598 element at address: 0x2000003aa200 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.4_qat
00:05:52.598 element at address: 0x2000003a6740 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.5_qat
00:05:52.598 element at address: 0x2000003a2c80 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.6_qat
00:05:52.598 element at address: 0x20000039f1c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.7_qat
00:05:52.598 element at address: 0x20000039b700 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.0_qat
00:05:52.598 element at address: 0x200000397c40 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.1_qat
00:05:52.598 element at address: 0x200000394180 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.2_qat
00:05:52.598 element at address: 0x2000003906c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.3_qat
00:05:52.598 element at address: 0x20000038cc00 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.4_qat
00:05:52.598 element at address: 0x200000389140 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.5_qat
00:05:52.598 element at address: 0x200000385680 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.6_qat
00:05:52.598 element at address: 0x200000381bc0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.7_qat
00:05:52.598 element at address: 0x20000037e100 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.0_qat
00:05:52.598 element at address: 0x20000037a640 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.1_qat
00:05:52.598 element at address: 0x200000376b80 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.2_qat
00:05:52.598 element at address: 0x2000003730c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.3_qat
00:05:52.598 element at address: 0x20000036f600 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.4_qat
00:05:52.598 element at address: 0x20000036bb40 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.5_qat
00:05:52.598 element at address: 0x200000368080 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.6_qat
00:05:52.598 element at address: 0x2000003645c0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.7_qat
00:05:52.598 element at address: 0x200000360b00 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.0_qat
00:05:52.598 element at address: 0x20000035d040 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.1_qat
00:05:52.598 element at address: 0x200000359580 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.2_qat
00:05:52.598 element at address: 0x200000355ac0 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.3_qat
00:05:52.598 element at address: 0x200000352000 with size: 0.000427 MiB
00:05:52.598 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.4_qat
00:05:52.598 element at address: 0x20000034e540 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.5_qat
00:05:52.599 element at address: 0x20000034aa80 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.6_qat
00:05:52.599 element at address: 0x200000346fc0 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.7_qat
00:05:52.599 element at address: 0x200000343500 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.0_qat
00:05:52.599 element at address: 0x20000033fa40 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.1_qat
00:05:52.599 element at address: 0x20000033bf80 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.2_qat
00:05:52.599 element at address: 0x2000003384c0 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.3_qat
00:05:52.599 element at address: 0x200000334a00 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.4_qat
00:05:52.599 element at address: 0x200000330f40 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.5_qat
00:05:52.599 element at address: 0x20000032d480 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.6_qat
00:05:52.599 element at address: 0x2000003299c0 with size: 0.000427 MiB
00:05:52.599 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.7_qat
00:05:52.599 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:05:52.599 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:05:52.599 element at address: 0x2000002263c0 with size: 0.000305 MiB
00:05:52.599 associated memzone info: size: 0.000183 MiB name: MP_msgpool_207925
00:05:52.599 element at address: 0x200000200f80 with size: 0.000305 MiB
00:05:52.599 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_207925
00:05:52.599 element at address: 0x200027e6c280 with size: 0.000305 MiB
00:05:52.599 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:05:52.599 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:05:52.599 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:05:52.599 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:05:52.599 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:05:52.599 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:05:52.599 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:05:52.599 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:05:52.599 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:05:52.599 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:05:52.599 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:05:52.599 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:05:52.599 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:05:52.599 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:05:52.599 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:05:52.599 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:05:52.599 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:05:52.599 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:05:52.599 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:05:52.599 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:05:52.599 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:05:52.599 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:05:52.599 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:05:52.599 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:05:52.599 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:05:52.599 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:05:52.599 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:05:52.599 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:05:52.599 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:05:52.599 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:05:52.599 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:05:52.599 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:05:52.599 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:05:52.599 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:05:52.599 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:05:52.599 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:05:52.599 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:05:52.599 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:05:52.599 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:05:52.599 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:05:52.599 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:05:52.599 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:05:52.599 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:05:52.599 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:05:52.599 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:05:52.599 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:05:52.599 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:05:52.599 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:05:52.599 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:05:52.599 element at address: 0x20000039b600 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:05:52.599 element at address: 0x20000039b440 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:05:52.599 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:05:52.599 element at address: 0x200000397b40 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:05:52.599 element at address: 0x200000397980 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:05:52.599 element at address: 0x200000397700 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:05:52.599 element at address: 0x200000394080 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:05:52.599 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:05:52.599 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:05:52.600 element at address: 0x200000393c40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:05:52.600 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:05:52.600 element at address: 0x200000390400 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:05:52.600 element at address: 0x200000390180 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:05:52.600 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:05:52.600 element at address: 0x20000038c940 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:05:52.600 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:05:52.600 element at address: 0x200000389040 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:05:52.600 element at address: 0x200000388e80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:05:52.600 element at address: 0x200000388c00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:05:52.600 element at address: 0x200000385580 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:05:52.600 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:05:52.600 element at address: 0x200000385140 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:05:52.600 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:05:52.600 element at address: 0x200000381900 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:05:52.600 element at address: 0x200000381680 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:05:52.600 element at address: 0x20000037e000 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:05:52.600 element at address: 0x20000037de40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:05:52.600 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:05:52.600 element at address: 0x20000037a540 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:05:52.600 element at address: 0x20000037a380 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:05:52.600 element at address: 0x20000037a100 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:05:52.600 element at address: 0x200000376a80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:05:52.600 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:05:52.600 element at address: 0x200000376640 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:05:52.600 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:05:52.600 element at address: 0x200000372e00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:05:52.600 element at address: 0x200000372b80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:05:52.600 element at address: 0x20000036f500 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:05:52.600 element at address: 0x20000036f340 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:05:52.600 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:05:52.600 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:05:52.600 element at address: 0x20000036b880 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:05:52.600 element at address: 0x20000036b600 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:05:52.600 element at address: 0x200000367f80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:05:52.600 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:05:52.600 element at address: 0x200000367b40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:05:52.600 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:05:52.600 element at address: 0x200000364300 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:05:52.600 element at address: 0x200000364080 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:05:52.600 element at address: 0x200000360a00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64
00:05:52.600 element at address: 0x200000360840 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65
00:05:52.600 element at address: 0x2000003605c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32
00:05:52.600 element at address: 0x20000035cf40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66
00:05:52.600 element at address: 0x20000035cd80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67
00:05:52.600 element at address: 0x20000035cb00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33
00:05:52.600 element at address: 0x200000359480 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68
00:05:52.600 element at address: 0x2000003592c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69
00:05:52.600 element at address: 0x200000359040 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34
00:05:52.600 element at address: 0x2000003559c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70
00:05:52.600 element at address: 0x200000355800 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71
00:05:52.600 element at address: 0x200000355580 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35
00:05:52.600 element at address: 0x200000351f00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72
00:05:52.600 element at address: 0x200000351d40 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73
00:05:52.600 element at address: 0x200000351ac0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36
00:05:52.600 element at address: 0x20000034e440 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74
00:05:52.600 element at address: 0x20000034e280 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75
00:05:52.600 element at address: 0x20000034e000 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37
00:05:52.600 element at address: 0x20000034a980 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76
00:05:52.600 element at address: 0x20000034a7c0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77
00:05:52.600 element at address: 0x20000034a540 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38
00:05:52.600 element at address: 0x200000346ec0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78
00:05:52.600 element at address: 0x200000346d00 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79
00:05:52.600 element at address: 0x200000346a80 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39
00:05:52.600 element at address: 0x200000343400 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80
00:05:52.600 element at address: 0x200000343240 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81
00:05:52.600 element at address: 0x200000342fc0 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40
00:05:52.600 element at address: 0x20000033f940 with size: 0.000244 MiB
00:05:52.600 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82
00:05:52.601 element at address: 0x20000033f780 with size: 0.000244 MiB
00:05:52.601 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83
00:05:52.601 element at address: 0x20000033f500 with size: 0.000244 MiB
00:05:52.601 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41
00:05:52.601 element at address: 0x20000033be80 with size: 0.000244 MiB
00:05:52.601 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84
00:05:52.601 element at address: 0x20000033bcc0 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85
00:05:52.860 element at address: 0x20000033ba40 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42
00:05:52.860 element at address: 0x2000003383c0 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86
00:05:52.860 element at address: 0x200000338200 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87
00:05:52.860 element at address: 0x200000337f80 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43
00:05:52.860 element at address: 0x200000334900 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88
00:05:52.860 element at address: 0x200000334740 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89
00:05:52.860 element at address: 0x2000003344c0 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44
00:05:52.860 element at address: 0x200000330e40 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90
00:05:52.860 element at address: 0x200000330c80 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91
00:05:52.860 element at address: 0x200000330a00 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45
00:05:52.860 element at address: 0x20000032d380 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92
00:05:52.860 element at address: 0x20000032d1c0 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93
00:05:52.860 element at address: 0x20000032cf40 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46
00:05:52.860 element at address: 0x2000003298c0 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94
00:05:52.860 element at address: 0x200000329700 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95
00:05:52.860 element at address: 0x200000329480 with size: 0.000244 MiB
00:05:52.860 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47
00:05:52.860 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:05:52.860 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:05:52.860 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:05:52.860 23:27:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 207925
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 207925 ']'
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 207925
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 207925
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 207925'
00:05:52.860 killing process with pid 207925
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 207925
00:05:52.860 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 207925
00:05:53.120
00:05:53.120 real 0m1.421s
00:05:53.120 user 0m1.524s
00:05:53.120 sys 0m0.383s
00:05:53.120 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable
00:05:53.120 23:27:37 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:05:53.120 ************************************
00:05:53.120 END TEST dpdk_mem_utility
00:05:53.120 ************************************
00:05:53.120 23:27:37 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:05:53.120 23:27:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:53.120 23:27:37 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:53.120 23:27:37 -- common/autotest_common.sh@10 -- # set +x
00:05:53.120 ************************************
00:05:53.120 START TEST event
00:05:53.120 ************************************
00:05:53.120 23:27:38 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:05:53.120 * Looking for test storage...
00:05:53.120 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:05:53.120 23:27:38 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:53.120 23:27:38 event -- bdev/nbd_common.sh@6 -- # set -e 00:05:53.120 23:27:38 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:53.120 23:27:38 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:05:53.120 23:27:38 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:53.120 23:27:38 event -- common/autotest_common.sh@10 -- # set +x 00:05:53.379 ************************************ 00:05:53.379 START TEST event_perf 00:05:53.379 ************************************ 00:05:53.379 23:27:38 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:53.379 Running I/O for 1 seconds...[2024-07-24 23:27:38.154379] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:53.379 [2024-07-24 23:27:38.154451] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208209 ] 00:05:53.379 [2024-07-24 23:27:38.221546] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:53.379 [2024-07-24 23:27:38.298754] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:53.379 [2024-07-24 23:27:38.298853] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:05:53.379 [2024-07-24 23:27:38.298943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:05:53.379 [2024-07-24 23:27:38.298945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.788 Running I/O for 1 seconds... 00:05:54.788 lcore 0: 213761 00:05:54.788 lcore 1: 213759 00:05:54.788 lcore 2: 213759 00:05:54.788 lcore 3: 213760 00:05:54.788 done. 
00:05:54.788 00:05:54.788 real 0m1.241s 00:05:54.788 user 0m4.150s 00:05:54.788 sys 0m0.087s 00:05:54.788 23:27:39 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:54.788 23:27:39 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:05:54.788 ************************************ 00:05:54.788 END TEST event_perf 00:05:54.788 ************************************ 00:05:54.788 23:27:39 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:54.788 23:27:39 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:05:54.788 23:27:39 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:54.788 23:27:39 event -- common/autotest_common.sh@10 -- # set +x 00:05:54.788 ************************************ 00:05:54.788 START TEST event_reactor 00:05:54.788 ************************************ 00:05:54.788 23:27:39 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:54.788 [2024-07-24 23:27:39.460771] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:54.788 [2024-07-24 23:27:39.460835] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208467 ] 00:05:54.788 [2024-07-24 23:27:39.525032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.788 [2024-07-24 23:27:39.597064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.725 test_start 00:05:55.725 oneshot 00:05:55.725 tick 100 00:05:55.725 tick 100 00:05:55.725 tick 250 00:05:55.725 tick 100 00:05:55.725 tick 100 00:05:55.725 tick 100 00:05:55.725 tick 250 00:05:55.725 tick 500 00:05:55.725 tick 100 00:05:55.725 tick 100 00:05:55.725 tick 250 00:05:55.725 tick 100 00:05:55.725 tick 100 00:05:55.725 test_end 00:05:55.725 00:05:55.725 real 0m1.231s 00:05:55.725 user 0m1.153s 00:05:55.725 sys 0m0.075s 00:05:55.725 23:27:40 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:55.725 23:27:40 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:05:55.725 ************************************ 00:05:55.725 END TEST event_reactor 00:05:55.725 ************************************ 00:05:55.725 23:27:40 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:55.725 23:27:40 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:05:55.725 23:27:40 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:55.725 23:27:40 event -- common/autotest_common.sh@10 -- # set +x 00:05:55.984 ************************************ 00:05:55.984 START TEST event_reactor_perf 00:05:55.984 ************************************ 00:05:55.984 23:27:40 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 
00:05:55.984 [2024-07-24 23:27:40.756107] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:05:55.984 [2024-07-24 23:27:40.756155] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208711 ] 00:05:55.984 [2024-07-24 23:27:40.819963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.984 [2024-07-24 23:27:40.896158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.362 test_start 00:05:57.362 test_end 00:05:57.362 Performance: 506958 events per second 00:05:57.362 00:05:57.362 real 0m1.233s 00:05:57.362 user 0m1.150s 00:05:57.362 sys 0m0.079s 00:05:57.362 23:27:41 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:57.362 23:27:41 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:05:57.362 ************************************ 00:05:57.362 END TEST event_reactor_perf 00:05:57.362 ************************************ 00:05:57.362 23:27:41 event -- event/event.sh@49 -- # uname -s 00:05:57.362 23:27:42 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:57.362 23:27:42 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:57.362 23:27:42 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:57.362 23:27:42 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:57.362 23:27:42 event -- common/autotest_common.sh@10 -- # set +x 00:05:57.362 ************************************ 00:05:57.362 START TEST event_scheduler 00:05:57.362 ************************************ 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 
00:05:57.362 * Looking for test storage... 00:05:57.362 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:05:57.362 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:57.362 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=208983 00:05:57.362 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:57.362 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 208983 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 208983 ']' 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:57.362 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:57.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:57.362 23:27:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:57.362 [2024-07-24 23:27:42.151910] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:05:57.362 [2024-07-24 23:27:42.151961] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid208983 ] 00:05:57.362 [2024-07-24 23:27:42.212065] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:57.362 [2024-07-24 23:27:42.293431] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.362 [2024-07-24 23:27:42.293451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.362 [2024-07-24 23:27:42.293473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:05:57.362 [2024-07-24 23:27:42.293473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:05:58.299 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:58.299 [2024-07-24 23:27:42.959881] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:05:58.299 [2024-07-24 23:27:42.959897] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:05:58.299 [2024-07-24 23:27:42.959905] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:58.299 [2024-07-24 23:27:42.959913] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:58.299 [2024-07-24 23:27:42.959918] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:58.299 23:27:42 
event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.299 23:27:42 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.299 23:27:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:58.299 [2024-07-24 23:27:43.041854] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:58.299 23:27:43 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.299 23:27:43 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:58.299 23:27:43 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:58.299 23:27:43 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:58.299 23:27:43 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:58.299 ************************************ 00:05:58.299 START TEST scheduler_create_thread 00:05:58.299 ************************************ 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.299 2 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.299 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 3 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 4 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 5 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 6 00:05:58.300 23:27:43 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 7 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 8 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 9 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:58.300 23:27:43 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 10 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:58.300 23:27:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:59.674 23:27:44 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:59.674 23:27:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:59.674 23:27:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:59.674 23:27:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:59.674 23:27:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:01.051 23:27:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:01.051 00:06:01.051 real 0m2.617s 00:06:01.051 user 0m0.022s 00:06:01.051 sys 0m0.006s 00:06:01.051 23:27:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:01.051 23:27:45 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:01.051 ************************************ 00:06:01.051 END TEST scheduler_create_thread 00:06:01.051 ************************************ 00:06:01.051 23:27:45 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:01.051 23:27:45 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 208983 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 208983 ']' 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 208983 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 208983 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:06:01.051 23:27:45 
event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 208983' 00:06:01.051 killing process with pid 208983 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 208983 00:06:01.051 23:27:45 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 208983 00:06:01.311 [2024-07-24 23:27:46.175906] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:01.572 00:06:01.572 real 0m4.334s 00:06:01.572 user 0m8.176s 00:06:01.572 sys 0m0.340s 00:06:01.572 23:27:46 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:01.572 23:27:46 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:01.572 ************************************ 00:06:01.572 END TEST event_scheduler 00:06:01.572 ************************************ 00:06:01.572 23:27:46 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:01.572 23:27:46 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:01.572 23:27:46 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:01.572 23:27:46 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:01.572 23:27:46 event -- common/autotest_common.sh@10 -- # set +x 00:06:01.572 ************************************ 00:06:01.572 START TEST app_repeat 00:06:01.572 ************************************ 00:06:01.572 23:27:46 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:01.572 
23:27:46 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@19 -- # repeat_pid=209814 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:01.572 23:27:46 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 209814' 00:06:01.572 Process app_repeat pid: 209814 00:06:01.573 23:27:46 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:01.573 23:27:46 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:01.573 spdk_app_start Round 0 00:06:01.573 23:27:46 event.app_repeat -- event/event.sh@25 -- # waitforlisten 209814 /var/tmp/spdk-nbd.sock 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 209814 ']' 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:01.573 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:01.573 23:27:46 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:01.573 [2024-07-24 23:27:46.453674] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:01.573 [2024-07-24 23:27:46.453709] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid209814 ] 00:06:01.573 [2024-07-24 23:27:46.515099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:01.832 [2024-07-24 23:27:46.597488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:01.832 [2024-07-24 23:27:46.597492] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.832 23:27:46 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:01.832 23:27:46 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:01.832 23:27:46 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.091 Malloc0 00:06:02.091 23:27:46 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.091 Malloc1 00:06:02.349 23:27:47 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.349 23:27:47 
event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.349 23:27:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:02.350 /dev/nbd0 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:06:02.350 1+0 records in 00:06:02.350 1+0 records out 00:06:02.350 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229817 s, 17.8 MB/s 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:02.350 23:27:47 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.350 23:27:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:02.611 /dev/nbd1 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 
)) 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:02.611 1+0 records in 00:06:02.611 1+0 records out 00:06:02.611 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000208103 s, 19.7 MB/s 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:02.611 23:27:47 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.611 23:27:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:02.870 { 00:06:02.870 "nbd_device": "/dev/nbd0", 00:06:02.870 "bdev_name": "Malloc0" 00:06:02.870 }, 00:06:02.870 { 00:06:02.870 "nbd_device": "/dev/nbd1", 00:06:02.870 "bdev_name": "Malloc1" 00:06:02.870 } 00:06:02.870 ]' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:02.870 { 00:06:02.870 "nbd_device": "/dev/nbd0", 00:06:02.870 "bdev_name": "Malloc0" 00:06:02.870 }, 00:06:02.870 { 00:06:02.870 "nbd_device": 
"/dev/nbd1", 00:06:02.870 "bdev_name": "Malloc1" 00:06:02.870 } 00:06:02.870 ]' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:02.870 /dev/nbd1' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:02.870 /dev/nbd1' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:02.870 256+0 records in 00:06:02.870 256+0 records out 00:06:02.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0100356 s, 104 MB/s 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:02.870 256+0 records in 00:06:02.870 256+0 records out 00:06:02.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0137333 s, 76.4 MB/s 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:02.870 256+0 records in 00:06:02.870 256+0 records out 00:06:02.870 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0144107 s, 72.8 MB/s 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@85 -- # 
rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:02.870 23:27:47 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:03.129 23:27:47 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 
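The `waitfornbd` and `waitfornbd_exit` helpers traced above both follow the same shape: poll `/proc/partitions` up to 20 times for the nbd device name to appear (or disappear), then `break` out of the loop. A minimal standalone sketch of that retry pattern (the `wait_for` helper below is illustrative, not SPDK's actual implementation; only the `i = 1 .. 20` loop and the `grep -q -w ... /proc/partitions` predicates come from the trace):

```shell
#!/usr/bin/env bash
# Poll until a predicate command succeeds, up to max_tries attempts.
# Mirrors the (( i = 1 )); (( i <= 20 )) loop with `break` seen in
# waitfornbd / waitfornbd_exit.
wait_for() {
    local max_tries=$1; shift
    local i
    for (( i = 1; i <= max_tries; i++ )); do
        if "$@"; then
            return 0        # condition met, stop polling
        fi
        sleep 0.1
    done
    return 1                # gave up after max_tries attempts
}

# Example predicates, analogous to the ones in the trace:
#   device present:  wait_for 20 grep -q -w nbd1 /proc/partitions
#   device gone:     poll with a negated grep until the entry disappears
```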
00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:03.388 23:27:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:03.647 23:27:48 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:03.647 23:27:48 event.app_repeat -- event/event.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:03.647 23:27:48 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:03.906 [2024-07-24 23:27:48.785016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:03.906 [2024-07-24 23:27:48.850448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.906 [2024-07-24 23:27:48.850451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.906 [2024-07-24 23:27:48.890891] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:03.906 [2024-07-24 23:27:48.890928] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:07.191 23:27:51 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:07.192 23:27:51 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:07.192 spdk_app_start Round 1 00:06:07.192 23:27:51 event.app_repeat -- event/event.sh@25 -- # waitforlisten 209814 /var/tmp/spdk-nbd.sock 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 209814 ']' 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:07.192 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
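The `waitforlisten` call above ("Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...", `max_retries=100`) blocks until the target app has created its RPC socket before any `rpc.py -s /var/tmp/spdk-nbd.sock` call is issued. A simplified sketch of that wait, assuming a plain existence check with bash's `-S` test is sufficient (SPDK's real helper also checks that the process is still alive; this version is only illustrative):

```shell
# Wait for a server to create its UNIX domain socket before talking to it.
# The retry budget mirrors the log's max_retries=100; this is a sketch,
# not SPDK's waitforlisten itself.
wait_for_socket() {
    local sock_path=$1
    local max_retries=${2:-100}
    local i
    for (( i = 0; i < max_retries; i++ )); do
        [ -S "$sock_path" ] && return 0   # -S: path exists and is a socket
        sleep 0.1
    done
    echo "timed out waiting for $sock_path" >&2
    return 1
}
```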
00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:07.192 23:27:51 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:07.192 23:27:51 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.192 Malloc0 00:06:07.192 23:27:51 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.192 Malloc1 00:06:07.192 23:27:52 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.192 23:27:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:07.449 /dev/nbd0 00:06:07.449 23:27:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:07.449 23:27:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.449 1+0 records in 00:06:07.449 1+0 records out 00:06:07.449 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000206776 s, 19.8 MB/s 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:07.449 23:27:52 event.app_repeat -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:07.449 23:27:52 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:07.449 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.449 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.449 23:27:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:07.707 /dev/nbd1 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.707 1+0 records in 00:06:07.707 1+0 records out 00:06:07.707 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248843 s, 16.5 MB/s 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:07.707 23:27:52 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.707 23:27:52 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:07.965 { 00:06:07.965 "nbd_device": "/dev/nbd0", 00:06:07.965 "bdev_name": "Malloc0" 00:06:07.965 }, 00:06:07.965 { 00:06:07.965 "nbd_device": "/dev/nbd1", 00:06:07.965 "bdev_name": "Malloc1" 00:06:07.965 } 00:06:07.965 ]' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:07.965 { 00:06:07.965 "nbd_device": "/dev/nbd0", 00:06:07.965 "bdev_name": "Malloc0" 00:06:07.965 }, 00:06:07.965 { 00:06:07.965 "nbd_device": "/dev/nbd1", 00:06:07.965 "bdev_name": "Malloc1" 00:06:07.965 } 00:06:07.965 ]' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:07.965 /dev/nbd1' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:07.965 /dev/nbd1' 00:06:07.965 
23:27:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:07.965 256+0 records in 00:06:07.965 256+0 records out 00:06:07.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104643 s, 100 MB/s 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:07.965 256+0 records in 00:06:07.965 256+0 records out 00:06:07.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0138535 s, 75.7 MB/s 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:07.965 256+0 records in 00:06:07.965 256+0 records out 00:06:07.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0147255 s, 71.2 MB/s 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
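The `nbd_dd_data_verify ... write` / `... verify` pair traced above is a round-trip check: fill a temp file with 1 MiB from `/dev/urandom` (`bs=4096 count=256`), `dd` it onto each nbd device with `oflag=direct`, then `cmp -b -n 1M` each device against the source file and delete the temp file. A sketch of the same round-trip using plain temp files as targets, so it runs without real block devices (a real run would target `/dev/nbdX` with `oflag=direct`, as in the log):

```shell
# Write/verify round-trip in the style of nbd_dd_data_verify: generate a
# random 1 MiB pattern, write it to every target, then byte-compare each
# target against the pattern. Targets here are ordinary files.
verify_targets() {
    local tmp_file t
    tmp_file=$(mktemp)
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 status=none
    for t in "$@"; do
        # write phase (the trace adds oflag=direct for real nbd devices)
        dd if="$tmp_file" of="$t" bs=4096 count=256 status=none
    done
    for t in "$@"; do
        # verify phase: -b prints differing bytes, -n 1M limits the compare
        cmp -b -n 1M "$tmp_file" "$t" || { rm -f "$tmp_file"; return 1; }
    done
    rm -f "$tmp_file"
    return 0
}
```

Comparing against the source file rather than between devices means a single corrupted target is identified directly, which is why the trace runs one `cmp` per nbd device.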
00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:07.965 23:27:52 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:08.223 23:27:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:08.481 23:27:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:08.739 23:27:53 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:08.739 23:27:53 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:08.739 23:27:53 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:08.997 [2024-07-24 23:27:53.854814] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:08.997 [2024-07-24 23:27:53.920381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:08.997 [2024-07-24 23:27:53.920384] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:06:08.997 [2024-07-24 23:27:53.961470] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:08.997 [2024-07-24 23:27:53.961508] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:12.280 23:27:56 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:12.280 23:27:56 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:12.280 spdk_app_start Round 2 00:06:12.280 23:27:56 event.app_repeat -- event/event.sh@25 -- # waitforlisten 209814 /var/tmp/spdk-nbd.sock 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 209814 ']' 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:12.280 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:12.280 23:27:56 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:12.280 23:27:56 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:12.280 Malloc0 00:06:12.280 23:27:57 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:12.280 Malloc1 00:06:12.280 23:27:57 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.280 23:27:57 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:12.538 /dev/nbd0 00:06:12.538 23:27:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:12.538 23:27:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.538 1+0 records in 00:06:12.538 1+0 records out 00:06:12.538 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186754 s, 21.9 MB/s 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:12.538 23:27:57 event.app_repeat -- 
common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:12.538 23:27:57 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:12.538 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.538 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.538 23:27:57 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:12.797 /dev/nbd1 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:12.797 1+0 records in 00:06:12.797 1+0 records out 00:06:12.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000100477 s, 40.8 MB/s 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:12.797 23:27:57 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:12.797 { 00:06:12.797 "nbd_device": "/dev/nbd0", 00:06:12.797 "bdev_name": "Malloc0" 00:06:12.797 }, 00:06:12.797 { 00:06:12.797 "nbd_device": "/dev/nbd1", 00:06:12.797 "bdev_name": "Malloc1" 00:06:12.797 } 00:06:12.797 ]' 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:12.797 { 00:06:12.797 "nbd_device": "/dev/nbd0", 00:06:12.797 "bdev_name": "Malloc0" 00:06:12.797 }, 00:06:12.797 { 00:06:12.797 "nbd_device": "/dev/nbd1", 00:06:12.797 "bdev_name": "Malloc1" 00:06:12.797 } 00:06:12.797 ]' 00:06:12.797 23:27:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:13.056 /dev/nbd1' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:13.056 /dev/nbd1' 00:06:13.056 
23:27:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:13.056 256+0 records in 00:06:13.056 256+0 records out 00:06:13.056 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010349 s, 101 MB/s 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:13.056 256+0 records in 00:06:13.056 256+0 records out 00:06:13.056 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0135787 s, 77.2 MB/s 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:13.056 256+0 records in 00:06:13.056 256+0 records out 00:06:13.056 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0147956 s, 70.9 MB/s 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:13.056 23:27:57 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.315 23:27:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:13.574 23:27:58 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:13.574 23:27:58 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:13.833 23:27:58 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:14.091 [2024-07-24 23:27:58.850128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:14.091 [2024-07-24 23:27:58.917653] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.091 [2024-07-24 23:27:58.917656] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:06:14.091 [2024-07-24 23:27:58.958173] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:14.091 [2024-07-24 23:27:58.958212] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:17.379 23:28:01 event.app_repeat -- event/event.sh@38 -- # waitforlisten 209814 /var/tmp/spdk-nbd.sock 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 209814 ']' 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:17.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:17.379 23:28:01 event.app_repeat -- event/event.sh@39 -- # killprocess 209814 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 209814 ']' 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 209814 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 209814 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 209814' 00:06:17.379 killing process with pid 209814 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@969 -- # kill 209814 00:06:17.379 23:28:01 event.app_repeat -- common/autotest_common.sh@974 -- # wait 209814 00:06:17.379 spdk_app_start is called in Round 0. 00:06:17.379 Shutdown signal received, stop current app iteration 00:06:17.379 Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 reinitialization... 00:06:17.379 spdk_app_start is called in Round 1. 00:06:17.379 Shutdown signal received, stop current app iteration 00:06:17.379 Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 reinitialization... 00:06:17.379 spdk_app_start is called in Round 2. 
00:06:17.379 Shutdown signal received, stop current app iteration 00:06:17.379 Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 reinitialization... 00:06:17.379 spdk_app_start is called in Round 3. 00:06:17.379 Shutdown signal received, stop current app iteration 00:06:17.379 23:28:02 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:17.379 23:28:02 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:17.379 00:06:17.379 real 0m15.593s 00:06:17.379 user 0m33.753s 00:06:17.379 sys 0m2.268s 00:06:17.379 23:28:02 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:17.379 23:28:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:17.379 ************************************ 00:06:17.379 END TEST app_repeat 00:06:17.379 ************************************ 00:06:17.379 23:28:02 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:17.379 00:06:17.379 real 0m24.052s 00:06:17.379 user 0m48.544s 00:06:17.379 sys 0m3.131s 00:06:17.379 23:28:02 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:17.379 23:28:02 event -- common/autotest_common.sh@10 -- # set +x 00:06:17.379 ************************************ 00:06:17.379 END TEST event 00:06:17.379 ************************************ 00:06:17.379 23:28:02 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:17.379 23:28:02 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:17.379 23:28:02 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:17.379 23:28:02 -- common/autotest_common.sh@10 -- # set +x 00:06:17.379 ************************************ 00:06:17.379 START TEST thread 00:06:17.379 ************************************ 00:06:17.379 23:28:02 thread -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:17.379 * Looking for test storage... 
00:06:17.379 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:17.379 23:28:02 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:17.379 23:28:02 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:06:17.379 23:28:02 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:17.379 23:28:02 thread -- common/autotest_common.sh@10 -- # set +x 00:06:17.379 ************************************ 00:06:17.379 START TEST thread_poller_perf 00:06:17.379 ************************************ 00:06:17.379 23:28:02 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:17.379 [2024-07-24 23:28:02.257782] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:17.379 [2024-07-24 23:28:02.257829] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid212675 ] 00:06:17.379 [2024-07-24 23:28:02.323344] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.638 [2024-07-24 23:28:02.395722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.638 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:18.576 ====================================== 00:06:18.576 busy:2106089640 (cyc) 00:06:18.576 total_run_count: 420000 00:06:18.576 tsc_hz: 2100000000 (cyc) 00:06:18.576 ====================================== 00:06:18.576 poller_cost: 5014 (cyc), 2387 (nsec) 00:06:18.576 00:06:18.576 real 0m1.235s 00:06:18.576 user 0m1.146s 00:06:18.576 sys 0m0.085s 00:06:18.576 23:28:03 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:18.576 23:28:03 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:18.576 ************************************ 00:06:18.576 END TEST thread_poller_perf 00:06:18.576 ************************************ 00:06:18.576 23:28:03 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:18.576 23:28:03 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:06:18.576 23:28:03 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:18.576 23:28:03 thread -- common/autotest_common.sh@10 -- # set +x 00:06:18.576 ************************************ 00:06:18.576 START TEST thread_poller_perf 00:06:18.576 ************************************ 00:06:18.576 23:28:03 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:18.576 [2024-07-24 23:28:03.563816] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:18.576 [2024-07-24 23:28:03.563870] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid212921 ] 00:06:18.888 [2024-07-24 23:28:03.631103] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.888 [2024-07-24 23:28:03.701531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.888 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:19.825 ====================================== 00:06:19.825 busy:2101575108 (cyc) 00:06:19.825 total_run_count: 5560000 00:06:19.825 tsc_hz: 2100000000 (cyc) 00:06:19.825 ====================================== 00:06:19.825 poller_cost: 377 (cyc), 179 (nsec) 00:06:19.826 00:06:19.826 real 0m1.232s 00:06:19.826 user 0m1.148s 00:06:19.826 sys 0m0.079s 00:06:19.826 23:28:04 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:19.826 23:28:04 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:19.826 ************************************ 00:06:19.826 END TEST thread_poller_perf 00:06:19.826 ************************************ 00:06:19.826 23:28:04 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:19.826 00:06:19.826 real 0m2.682s 00:06:19.826 user 0m2.394s 00:06:19.826 sys 0m0.295s 00:06:19.826 23:28:04 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:19.826 23:28:04 thread -- common/autotest_common.sh@10 -- # set +x 00:06:19.826 ************************************ 00:06:19.826 END TEST thread 00:06:19.826 ************************************ 00:06:20.085 23:28:04 -- spdk/autotest.sh@184 -- # [[ 1 -eq 1 ]] 00:06:20.085 23:28:04 -- spdk/autotest.sh@185 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:20.085 23:28:04 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 
']' 00:06:20.085 23:28:04 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:20.085 23:28:04 -- common/autotest_common.sh@10 -- # set +x 00:06:20.085 ************************************ 00:06:20.085 START TEST accel 00:06:20.085 ************************************ 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:20.085 * Looking for test storage... 00:06:20.085 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:20.085 23:28:04 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:20.085 23:28:04 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:20.085 23:28:04 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:20.085 23:28:04 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=213211 00:06:20.085 23:28:04 accel -- accel/accel.sh@63 -- # waitforlisten 213211 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@831 -- # '[' -z 213211 ']' 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.085 23:28:04 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:20.085 23:28:04 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.085 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:20.085 23:28:04 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:20.085 23:28:04 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:20.085 23:28:04 accel -- common/autotest_common.sh@10 -- # set +x 00:06:20.085 23:28:04 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:20.085 23:28:04 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.085 23:28:04 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.085 23:28:04 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:20.085 23:28:04 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:20.085 23:28:04 accel -- accel/accel.sh@41 -- # jq -r . 00:06:20.085 [2024-07-24 23:28:05.020187] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:20.085 [2024-07-24 23:28:05.020233] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid213211 ] 00:06:20.085 [2024-07-24 23:28:05.084492] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.345 [2024-07-24 23:28:05.162012] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.912 23:28:05 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:20.912 23:28:05 accel -- common/autotest_common.sh@864 -- # return 0 00:06:20.912 23:28:05 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:20.912 23:28:05 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:20.912 23:28:05 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:20.912 23:28:05 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:20.912 23:28:05 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:20.912 23:28:05 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:20.912 23:28:05 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:20.912 23:28:05 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:20.912 23:28:05 accel -- common/autotest_common.sh@10 -- # set +x 00:06:20.912 23:28:05 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.912 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.912 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.912 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.912 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.912 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.912 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.912 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # 
read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in 
"${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # IFS== 00:06:20.913 23:28:05 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:20.913 23:28:05 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:20.913 23:28:05 accel -- accel/accel.sh@75 -- # killprocess 213211 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@950 -- # '[' -z 213211 ']' 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@954 -- # kill -0 213211 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@955 -- # uname 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 213211 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 213211' 00:06:20.913 killing process with pid 213211 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@969 -- # kill 213211 00:06:20.913 23:28:05 accel -- common/autotest_common.sh@974 -- # wait 213211 00:06:21.481 23:28:06 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:21.481 23:28:06 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:21.481 23:28:06 
accel -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:06:21.481 23:28:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:21.481 23:28:06 accel -- common/autotest_common.sh@10 -- # set +x 00:06:21.481 23:28:06 accel.accel_help -- common/autotest_common.sh@1125 -- # accel_perf -h 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.481 23:28:06 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.482 23:28:06 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:21.482 23:28:06 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:21.482 23:28:06 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:06:21.482 23:28:06 accel.accel_help -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:21.482 23:28:06 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:21.482 23:28:06 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:21.482 23:28:06 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:21.482 23:28:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:21.482 23:28:06 accel -- common/autotest_common.sh@10 -- # set +x 00:06:21.482 ************************************ 00:06:21.482 START TEST accel_missing_filename 00:06:21.482 ************************************ 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # local es=0 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:21.482 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:21.482 23:28:06 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:21.482 23:28:06 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:21.482 [2024-07-24 23:28:06.372386] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:21.482 [2024-07-24 23:28:06.372442] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid213474 ] 00:06:21.482 [2024-07-24 23:28:06.437226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.741 [2024-07-24 23:28:06.509825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.741 [2024-07-24 23:28:06.564545] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:21.741 [2024-07-24 23:28:06.624657] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:06:21.741 A filename is required. 
00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # es=234 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # es=106 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@663 -- # case "$es" in 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@670 -- # es=1 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:21.741 00:06:21.741 real 0m0.353s 00:06:21.741 user 0m0.260s 00:06:21.741 sys 0m0.122s 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:21.741 23:28:06 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:21.741 ************************************ 00:06:21.741 END TEST accel_missing_filename 00:06:21.741 ************************************ 00:06:21.741 23:28:06 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:21.741 23:28:06 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:06:21.741 23:28:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:21.741 23:28:06 accel -- common/autotest_common.sh@10 -- # set +x 00:06:22.001 ************************************ 00:06:22.001 START TEST accel_compress_verify 00:06:22.001 ************************************ 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # local es=0 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.001 23:28:06 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:22.001 23:28:06 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:22.001 [2024-07-24 23:28:06.769831] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:22.001 [2024-07-24 23:28:06.769867] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid213514 ] 00:06:22.001 [2024-07-24 23:28:06.832336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.001 [2024-07-24 23:28:06.903889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.001 [2024-07-24 23:28:06.961361] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:22.261 [2024-07-24 23:28:07.022280] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:06:22.261 00:06:22.261 Compression does not support the verify option, aborting. 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # es=161 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # es=33 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@663 -- # case "$es" in 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@670 -- # es=1 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:22.261 00:06:22.261 real 0m0.348s 00:06:22.261 user 0m0.235s 00:06:22.261 sys 0m0.119s 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:22.261 23:28:07 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:22.261 ************************************ 00:06:22.261 END TEST accel_compress_verify 00:06:22.261 ************************************ 00:06:22.261 23:28:07 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:22.261 23:28:07 accel -- 
common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:22.261 23:28:07 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:22.261 23:28:07 accel -- common/autotest_common.sh@10 -- # set +x 00:06:22.261 ************************************ 00:06:22.261 START TEST accel_wrong_workload 00:06:22.261 ************************************ 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w foobar 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # local es=0 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:22.261 23:28:07 
accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=,
00:06:22.261 23:28:07 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r .
00:06:22.261 Unsupported workload type: foobar
00:06:22.261 [2024-07-24 23:28:07.179411] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:06:22.261 accel_perf options:
00:06:22.261 [-h help message]
00:06:22.261 [-q queue depth per core]
00:06:22.261 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:22.261 [-T number of threads per core
00:06:22.261 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:22.261 [-t time in seconds]
00:06:22.261 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:22.261 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:06:22.261 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:22.261 [-l for compress/decompress workloads, name of uncompressed input file
00:06:22.261 [-S for crc32c workload, use this seed value (default 0)
00:06:22.261 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:22.261 [-f for fill workload, use this BYTE value (default 255)
00:06:22.261 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:22.261 [-y verify result if this switch is on]
00:06:22.261 [-a tasks to allocate per core (default: same value as -q)]
00:06:22.261 Can be used to spread operations across a wider range of memory.
00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # es=1 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:22.261 00:06:22.261 real 0m0.037s 00:06:22.261 user 0m0.040s 00:06:22.261 sys 0m0.019s 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:22.261 23:28:07 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:22.261 ************************************ 00:06:22.261 END TEST accel_wrong_workload 00:06:22.261 ************************************ 00:06:22.261 23:28:07 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:22.261 23:28:07 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:06:22.261 23:28:07 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:22.261 23:28:07 accel -- common/autotest_common.sh@10 -- # set +x 00:06:22.261 ************************************ 00:06:22.261 START TEST accel_negative_buffers 00:06:22.261 ************************************ 00:06:22.261 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:22.261 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # local es=0 00:06:22.262 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:22.262 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:22.262 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.262 23:28:07 accel.accel_negative_buffers -- 
common/autotest_common.sh@642 -- # type -t accel_perf
00:06:22.262 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:06:22.262 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=,
00:06:22.262 23:28:07 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r .
00:06:22.521 -x option must be non-negative.
00:06:22.521 [2024-07-24 23:28:07.275863] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:06:22.521 accel_perf options:
00:06:22.521 [-h help message]
00:06:22.521 [-q queue depth per core]
00:06:22.521 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:06:22.521 [-T number of threads per core
00:06:22.521 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:22.521 [-t time in seconds]
00:06:22.521 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:06:22.521 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:06:22.521 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:06:22.521 [-l for compress/decompress workloads, name of uncompressed input file
00:06:22.521 [-S for crc32c workload, use this seed value (default 0)
00:06:22.521 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:06:22.521 [-f for fill workload, use this BYTE value (default 255)
00:06:22.521 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:22.521 [-y verify result if this switch is on]
00:06:22.521 [-a tasks to allocate per core (default: same value as -q)]
00:06:22.521 Can be used to spread operations across a wider range of memory.
00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # es=1 00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:22.521 00:06:22.521 real 0m0.037s 00:06:22.521 user 0m0.024s 00:06:22.521 sys 0m0.013s 00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:22.521 23:28:07 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:22.521 ************************************ 00:06:22.521 END TEST accel_negative_buffers 00:06:22.521 ************************************ 00:06:22.522 Error: writing output failed: Broken pipe 00:06:22.522 23:28:07 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:22.522 23:28:07 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:06:22.522 23:28:07 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:22.522 23:28:07 accel -- common/autotest_common.sh@10 -- # set +x 00:06:22.522 ************************************ 00:06:22.522 START TEST accel_crc32c 00:06:22.522 ************************************ 00:06:22.522 23:28:07 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:22.522 23:28:07 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:22.522 [2024-07-24 23:28:07.381101] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:22.522 [2024-07-24 23:28:07.381154] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid213776 ] 00:06:22.522 [2024-07-24 23:28:07.448971] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.781 [2024-07-24 23:28:07.526270] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.781 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.781 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.781 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.781 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.781 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r 
var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 
accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val=Yes 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:22.782 23:28:07 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 
accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:23.719 23:28:08 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:23.719 00:06:23.719 real 0m1.365s 00:06:23.719 user 0m1.247s 00:06:23.719 sys 0m0.124s 00:06:23.719 23:28:08 accel.accel_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:23.719 23:28:08 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:23.719 ************************************ 00:06:23.719 END TEST accel_crc32c 00:06:23.719 ************************************ 00:06:23.978 23:28:08 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:23.978 23:28:08 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:06:23.978 23:28:08 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:23.978 23:28:08 accel -- common/autotest_common.sh@10 -- # set +x 00:06:23.978 ************************************ 00:06:23.978 START TEST accel_crc32c_C2 00:06:23.978 ************************************ 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:23.978 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:23.978 [2024-07-24 23:28:08.805981] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:23.978 [2024-07-24 23:28:08.806026] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214029 ] 00:06:23.978 [2024-07-24 23:28:08.871298] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.978 [2024-07-24 23:28:08.942004] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.237 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:08 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:24.238 
23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:24.238 23:28:09 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:25.175 00:06:25.175 real 0m1.372s 00:06:25.175 user 0m1.250s 00:06:25.175 sys 0m0.121s 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.175 23:28:10 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:25.175 ************************************ 00:06:25.175 END TEST accel_crc32c_C2 00:06:25.175 ************************************ 00:06:25.434 23:28:10 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:25.434 23:28:10 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:25.434 23:28:10 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.434 23:28:10 accel -- common/autotest_common.sh@10 -- # set +x 00:06:25.434 ************************************ 00:06:25.434 START TEST accel_copy 00:06:25.434 ************************************ 00:06:25.434 23:28:10 accel.accel_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy -y 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w copy -y 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:25.434 23:28:10 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:25.434 [2024-07-24 23:28:10.243687] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:25.434 [2024-07-24 23:28:10.243744] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214271 ] 00:06:25.434 [2024-07-24 23:28:10.311033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.434 [2024-07-24 23:28:10.384417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:06:25.694 23:28:10 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 
accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 
accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:25.694 23:28:10 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:26.632 23:28:11 accel.accel_copy -- 
accel/accel.sh@21 -- # case "$var" in 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:26.632 23:28:11 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:26.632 00:06:26.632 real 0m1.363s 00:06:26.632 user 0m1.245s 00:06:26.632 sys 0m0.124s 00:06:26.632 23:28:11 accel.accel_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.632 23:28:11 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:26.632 ************************************ 00:06:26.632 END TEST accel_copy 00:06:26.632 ************************************ 00:06:26.632 23:28:11 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:26.632 23:28:11 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:06:26.632 23:28:11 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.632 23:28:11 accel -- common/autotest_common.sh@10 -- # set +x 00:06:26.892 ************************************ 00:06:26.892 START TEST accel_fill 00:06:26.892 ************************************ 00:06:26.892 23:28:11 accel.accel_fill -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:06:26.892 [2024-07-24 23:28:11.668241] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:26.892 [2024-07-24 23:28:11.668286] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214519 ] 00:06:26.892 [2024-07-24 23:28:11.731553] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.892 [2024-07-24 23:28:11.803187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 
23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.892 23:28:11 
accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:06:26.892 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 
23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:26.893 23:28:11 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- 
accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:28.269 23:28:12 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:28.269 00:06:28.269 real 0m1.359s 00:06:28.269 user 0m1.236s 00:06:28.269 sys 0m0.124s 00:06:28.269 23:28:12 accel.accel_fill -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:28.269 23:28:12 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:06:28.269 ************************************ 00:06:28.269 END TEST accel_fill 00:06:28.269 ************************************ 00:06:28.269 23:28:13 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:28.269 23:28:13 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:28.269 23:28:13 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:28.269 23:28:13 accel -- common/autotest_common.sh@10 -- # set +x 00:06:28.269 ************************************ 00:06:28.269 START TEST accel_copy_crc32c 00:06:28.269 ************************************ 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # 
local accel_module 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:28.269 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:28.269 [2024-07-24 23:28:13.094053] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:28.269 [2024-07-24 23:28:13.094100] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid214771 ] 00:06:28.269 [2024-07-24 23:28:13.159309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.269 [2024-07-24 23:28:13.229732] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:28.526 23:28:13 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:29.459 00:06:29.459 real 0m1.362s 00:06:29.459 user 0m1.240s 00:06:29.459 sys 0m0.122s 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:29.459 23:28:14 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:29.459 ************************************ 00:06:29.459 END TEST accel_copy_crc32c 00:06:29.459 ************************************ 00:06:29.717 23:28:14 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:29.717 23:28:14 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:06:29.717 23:28:14 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:29.717 23:28:14 accel -- common/autotest_common.sh@10 -- # set +x 00:06:29.717 ************************************ 00:06:29.717 START TEST accel_copy_crc32c_C2 00:06:29.717 ************************************ 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.717 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.718 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:29.718 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:29.718 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:29.718 [2024-07-24 23:28:14.528133] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:29.718 [2024-07-24 23:28:14.528187] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215015 ] 00:06:29.718 [2024-07-24 23:28:14.593540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.718 [2024-07-24 23:28:14.664381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.975 23:28:14 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.975 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:29.976 23:28:14 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.911 
23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.911 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.912 00:06:30.912 real 0m1.371s 00:06:30.912 user 0m1.245s 00:06:30.912 sys 0m0.121s 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:30.912 23:28:15 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:30.912 ************************************ 00:06:30.912 END TEST accel_copy_crc32c_C2 00:06:30.912 ************************************ 00:06:30.912 23:28:15 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:30.912 23:28:15 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:30.912 23:28:15 accel -- common/autotest_common.sh@1107 
-- # xtrace_disable 00:06:30.912 23:28:15 accel -- common/autotest_common.sh@10 -- # set +x 00:06:31.171 ************************************ 00:06:31.171 START TEST accel_dualcast 00:06:31.171 ************************************ 00:06:31.171 23:28:15 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dualcast -y 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:06:31.171 23:28:15 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:06:31.171 [2024-07-24 23:28:15.942957] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:31.171 [2024-07-24 23:28:15.942992] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215263 ] 00:06:31.171 [2024-07-24 23:28:16.003959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.171 [2024-07-24 23:28:16.074331] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:31.171 23:28:16 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:31.172 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:31.172 23:28:16 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:32.548 23:28:17 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.548 00:06:32.548 real 0m1.337s 00:06:32.548 user 0m1.229s 00:06:32.548 sys 0m0.115s 00:06:32.548 23:28:17 accel.accel_dualcast -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:32.548 23:28:17 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:06:32.548 ************************************ 00:06:32.548 END TEST accel_dualcast 00:06:32.548 ************************************ 00:06:32.548 23:28:17 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:32.548 23:28:17 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:06:32.548 23:28:17 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:32.548 23:28:17 accel -- common/autotest_common.sh@10 -- # set +x 00:06:32.548 ************************************ 00:06:32.548 START TEST accel_compare 00:06:32.548 ************************************ 00:06:32.548 23:28:17 accel.accel_compare -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compare -y 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:32.548 23:28:17 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.549 
23:28:17 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:06:32.549 [2024-07-24 23:28:17.348862] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:32.549 [2024-07-24 23:28:17.348906] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215508 ] 00:06:32.549 [2024-07-24 23:28:17.412126] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.549 [2024-07-24 23:28:17.482906] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.549 23:28:17 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.549 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:32.808 23:28:17 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:32.808 23:28:17 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 
00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val
00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]]
00:06:33.746 23:28:18 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:33.746
00:06:33.746 real 0m1.354s
00:06:33.746 user 0m1.234s
00:06:33.746 sys 0m0.126s
00:06:33.746 23:28:18 accel.accel_compare -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:33.746 23:28:18 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x
00:06:33.746 ************************************
00:06:33.746 END TEST accel_compare
00:06:33.746 ************************************
00:06:33.746 23:28:18 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y
00:06:33.746 23:28:18 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']'
00:06:33.746 23:28:18 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:33.746 23:28:18 accel -- common/autotest_common.sh@10 -- # set +x
00:06:33.747 ************************************
00:06:33.747 START TEST accel_xor
00:06:33.747 ************************************
00:06:33.747 23:28:18 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor
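The xtrace entries above repeatedly show accel.sh setting `IFS=:`, calling `read -r var val`, and dispatching on `case "$var" in`. A minimal sketch of that colon-separated key/value parsing pattern follows; `parse_config` and the `opc:`/`module:` line format are hypothetical stand-ins for illustration, not SPDK's actual accel.sh code.

```shell
# Hedged sketch of the "IFS=: read -r var val" loop visible in the xtrace:
# each input line such as "opc:compare" is split on the first ':' into a
# key (var) and a value (val), then dispatched with a case statement.
# parse_config and the key names are illustrative, not SPDK's real helper.
parse_config() {
    local var val
    while IFS=: read -r var val; do
        case "$var" in
            opc) echo "accel_opc=$val" ;;       # operation, e.g. compare/xor
            module) echo "accel_module=$val" ;; # backing module, e.g. software
            *) ;;                               # ignore unrecognized keys
        esac
    done
}

# prints accel_opc=compare then accel_module=software
printf 'opc:compare\nmodule:software\n' | parse_config
```

Setting `IFS=:` only on the `read` command (rather than globally) keeps the field splitting local to that one invocation, which is why the trace shows `IFS=:` re-applied before every `read -r var val`.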
00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:33.747 23:28:18 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:34.006 [2024-07-24 23:28:18.766554] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:34.006 [2024-07-24 23:28:18.766599] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid215752 ] 00:06:34.006 [2024-07-24 23:28:18.831375] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.006 [2024-07-24 23:28:18.902138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 
23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- 
accel/accel.sh@20 -- # val=software 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:34.006 23:28:18 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.383 23:28:20 accel.accel_xor -- 
accel/accel.sh@20 -- # val=
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:06:35.383 23:28:20 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:35.383
00:06:35.383 real 0m1.364s
00:06:35.383 user 0m1.245s
00:06:35.383 sys 0m0.123s
00:06:35.383 23:28:20 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:35.383 23:28:20 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x
00:06:35.383 ************************************
00:06:35.383 END TEST accel_xor
00:06:35.383 ************************************
00:06:35.383 23:28:20 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3
00:06:35.383 23:28:20 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']'
00:06:35.383 23:28:20 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:35.383 23:28:20 accel -- common/autotest_common.sh@10 -- # set +x
00:06:35.384 ************************************
00:06:35.384 START TEST accel_xor
00:06:35.384 ************************************
00:06:35.384 23:28:20 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y -x 3
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config
00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:35.384 [2024-07-24 23:28:20.191184] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:35.384 [2024-07-24 23:28:20.191227] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216000 ] 00:06:35.384 [2024-07-24 23:28:20.254077] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.384 [2024-07-24 23:28:20.325193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- 
# val=0x1 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.384 23:28:20 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.384 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- 
accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:35.643 23:28:20 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:36.578 23:28:21 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:06:36.578 23:28:21 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:06:36.578
00:06:36.578 real 0m1.350s
00:06:36.578 user 0m1.247s
00:06:36.578 sys 0m0.111s
00:06:36.578 23:28:21 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:36.578 23:28:21 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x
00:06:36.578 ************************************
00:06:36.578 END TEST accel_xor
00:06:36.578 ************************************
00:06:36.578 23:28:21 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify
00:06:36.578 23:28:21 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:06:36.578 23:28:21 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:36.578 23:28:21 accel -- common/autotest_common.sh@10 -- # set +x
00:06:36.578 ************************************
00:06:36.578 START TEST accel_dif_verify
00:06:36.578 ************************************
00:06:36.578 23:28:21 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_verify
00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc
00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module
00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:06:36.578 23:28:21 accel.accel_dif_verify --
accel/accel.sh@19 -- # read -r var val 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:36.578 23:28:21 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.579 23:28:21 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.579 23:28:21 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:36.579 23:28:21 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:36.579 23:28:21 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:06:36.837 [2024-07-24 23:28:21.596070] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:36.838 [2024-07-24 23:28:21.596116] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216250 ] 00:06:36.838 [2024-07-24 23:28:21.659503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.838 [2024-07-24 23:28:21.729555] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 
23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # 
val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var 
val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:36.838 23:28:21 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:06:38.214 23:28:22 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.214 00:06:38.214 real 0m1.349s 00:06:38.214 user 0m1.241s 00:06:38.214 sys 0m0.117s 00:06:38.214 23:28:22 accel.accel_dif_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:38.214 23:28:22 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:06:38.214 ************************************ 00:06:38.214 END TEST accel_dif_verify 00:06:38.214 ************************************ 00:06:38.214 23:28:22 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:38.214 23:28:22 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:38.214 23:28:22 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:38.214 23:28:22 accel -- 
common/autotest_common.sh@10 -- # set +x 00:06:38.214 ************************************ 00:06:38.214 START TEST accel_dif_generate 00:06:38.214 ************************************ 00:06:38.214 23:28:22 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:06:38.214 23:28:22 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:06:38.214 [2024-07-24 23:28:22.999033] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
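(Editor's aside: the repeated `IFS=: -- read -r var val -- case "$var" in` trace above is accel.sh stepping through colon-delimited key/value pairs emitted by accel_perf, capturing `accel_opc` and `accel_module` along the way. Below is a minimal hypothetical sketch of that parsing pattern; the field names `opc` and `module` are illustrative assumptions, not the actual accel.sh keys.)

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction of the config-reading loop traced above.
# NOTE: key names ("opc", "module") are assumed for illustration only.
parse_accel_output() {
    local accel_opc="" accel_module="" var val
    # Split each input line on ':' into a key ($var) and a value ($val),
    # exactly as the "IFS=: read -r var val" trace lines show.
    while IFS=: read -r var val; do
        case "$var" in
            opc)    accel_opc=$val ;;
            module) accel_module=$val ;;
        esac
    done
    echo "$accel_opc $accel_module"
}

printf 'opc:dif_verify\nmodule:software\n' | parse_accel_output
# prints "dif_verify software"
```

Because the loop body is a plain `while ... done` reading the function's stdin (not an internal pipe), the variables set inside it remain visible after the loop ends.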
00:06:38.214 [2024-07-24 23:28:22.999080] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216494 ] 00:06:38.214 [2024-07-24 23:28:23.062093] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.214 [2024-07-24 23:28:23.133541] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.214 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.214 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.214 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.214 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.214 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # 
case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r 
var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:38.215 23:28:23 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 
23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:06:39.585 23:28:24 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.585 00:06:39.585 real 0m1.356s 00:06:39.585 user 0m1.240s 00:06:39.585 sys 0m0.121s 00:06:39.585 23:28:24 accel.accel_dif_generate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:39.585 23:28:24 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:06:39.585 ************************************ 00:06:39.585 END TEST accel_dif_generate 00:06:39.585 ************************************ 00:06:39.585 23:28:24 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w 
dif_generate_copy 00:06:39.585 23:28:24 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:39.585 23:28:24 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:39.585 23:28:24 accel -- common/autotest_common.sh@10 -- # set +x 00:06:39.585 ************************************ 00:06:39.585 START TEST accel_dif_generate_copy 00:06:39.585 ************************************ 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate_copy 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:39.585 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 
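(Editor's aside: each test in this log is wrapped by `run_test`, which prints the `START TEST` / `END TEST` banners and the `real/user/sys` timing visible above. A minimal sketch of that wrapper pattern follows; it is an assumption about the shape of the helper, not the actual code from autotest_common.sh.)

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the run_test wrapper pattern seen in this log:
# banner, timed invocation of the test command, closing banner.
run_test() {
    local name=$1
    shift
    echo "************ START TEST $name ************"
    time "$@"            # produces the real/user/sys lines in the log
    local rc=$?
    echo "************ END TEST $name ************"
    return $rc
}

run_test demo_true true   # runs `true` under the banners
```

The wrapper preserves the wrapped command's exit status, so a failing test still fails the surrounding pipeline stage.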
00:06:39.585 [2024-07-24 23:28:24.421462] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:39.585 [2024-07-24 23:28:24.421513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216744 ] 00:06:39.585 [2024-07-24 23:28:24.484693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.585 [2024-07-24 23:28:24.555670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:39.882 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 
00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:06:39.883 
23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:39.883 23:28:24 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.836 00:06:40.836 real 0m1.358s 00:06:40.836 user 0m1.239s 00:06:40.836 sys 0m0.124s 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:40.836 23:28:25 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:06:40.836 ************************************ 00:06:40.836 END TEST accel_dif_generate_copy 00:06:40.836 ************************************ 00:06:40.836 23:28:25 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:06:40.836 23:28:25 accel -- accel/accel.sh@116 -- # run_test accel_comp 
accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:40.836 23:28:25 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:06:40.836 23:28:25 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:40.836 23:28:25 accel -- common/autotest_common.sh@10 -- # set +x 00:06:40.836 ************************************ 00:06:40.836 START TEST accel_comp 00:06:40.836 ************************************ 00:06:40.836 23:28:25 accel.accel_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:06:40.836 23:28:25 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 
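(Editor's aside: the `[[ software == \s\o\f\t\w\a\r\e ]]` checks in the trace backslash-escape every character of the right-hand side so bash treats it as a literal string rather than a glob pattern; that is how xtrace renders a quoted comparison. A plain equivalent, shown here as a hedged sketch rather than the harness's actual code:)

```shell
#!/usr/bin/env bash
# Inside [[ ]], an unquoted RHS is a glob pattern; quoting it (or escaping
# every character, as the xtrace output shows) forces an exact string match.
accel_module=software

if [[ "$accel_module" == "software" ]]; then
    echo "module check passed"
fi
```

This is why the log's final assertions (`[[ -n software ]]`, `[[ -n dif_verify ]]`, the escaped equality test) confirm that the software fallback module actually handled the requested opcode.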
00:06:41.094 [2024-07-24 23:28:25.841209] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:41.094 [2024-07-24 23:28:25.841255] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid216990 ] 00:06:41.094 [2024-07-24 23:28:25.906762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.094 [2024-07-24 23:28:25.978279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 
00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.094 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 
00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 
23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:41.095 23:28:26 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:42.465 23:28:27 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:42.465 23:28:27 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.465 00:06:42.465 real 0m1.360s 00:06:42.465 user 0m1.239s 00:06:42.465 sys 0m0.126s 00:06:42.465 23:28:27 accel.accel_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:42.465 23:28:27 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:06:42.465 ************************************ 00:06:42.465 END TEST accel_comp 00:06:42.465 ************************************ 00:06:42.465 23:28:27 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:42.465 23:28:27 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:06:42.465 23:28:27 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:42.465 23:28:27 accel -- common/autotest_common.sh@10 -- # set +x 00:06:42.465 ************************************ 00:06:42.465 START TEST accel_decomp 00:06:42.465 ************************************ 00:06:42.465 23:28:27 accel.accel_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w 
decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:42.465 23:28:27 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:42.466 [2024-07-24 23:28:27.265686] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:42.466 [2024-07-24 23:28:27.265733] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217243 ] 00:06:42.466 [2024-07-24 23:28:27.331421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.466 [2024-07-24 23:28:27.402539] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.466 
23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.466 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:42.723 23:28:27 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.723 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:42.724 23:28:27 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.656 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.657 23:28:28 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:43.657 23:28:28 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.657 00:06:43.657 real 0m1.361s 00:06:43.657 user 0m1.242s 00:06:43.657 sys 0m0.124s 00:06:43.657 23:28:28 accel.accel_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:43.657 23:28:28 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:43.657 ************************************ 00:06:43.657 END TEST accel_decomp 00:06:43.657 ************************************ 00:06:43.657 23:28:28 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.657 23:28:28 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:06:43.657 23:28:28 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:43.657 23:28:28 accel -- common/autotest_common.sh@10 -- # set +x 00:06:43.915 ************************************ 00:06:43.915 START TEST accel_decomp_full 00:06:43.915 ************************************ 00:06:43.915 23:28:28 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 
23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:06:43.915 [2024-07-24 23:28:28.694722] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:43.915 [2024-07-24 23:28:28.694766] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217491 ] 00:06:43.915 [2024-07-24 23:28:28.757540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.915 [2024-07-24 23:28:28.828503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.915 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:43.916 23:28:28 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.287 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:45.288 23:28:30 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.288 00:06:45.288 real 0m1.372s 00:06:45.288 user 0m1.243s 00:06:45.288 sys 0m0.131s 00:06:45.288 23:28:30 accel.accel_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:45.288 23:28:30 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:06:45.288 ************************************ 00:06:45.288 END TEST accel_decomp_full 00:06:45.288 ************************************ 00:06:45.288 23:28:30 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.288 23:28:30 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:06:45.288 23:28:30 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:45.288 23:28:30 accel -- common/autotest_common.sh@10 -- # set +x 00:06:45.288 ************************************ 00:06:45.288 START TEST accel_decomp_mcore 
00:06:45.288 ************************************ 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:45.288 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:45.288 [2024-07-24 23:28:30.134575] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:45.288 [2024-07-24 23:28:30.134629] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid217766 ] 00:06:45.288 [2024-07-24 23:28:30.201731] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:45.288 [2024-07-24 23:28:30.275594] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:45.288 [2024-07-24 23:28:30.275689] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:45.288 [2024-07-24 23:28:30.275777] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:45.288 [2024-07-24 23:28:30.275779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:45.546 23:28:30 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.479 00:06:46.479 real 0m1.373s 00:06:46.479 user 0m4.606s 00:06:46.479 sys 0m0.132s 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:46.479 23:28:31 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:46.479 ************************************ 00:06:46.479 END TEST accel_decomp_mcore 00:06:46.479 ************************************ 00:06:46.737 23:28:31 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:46.737 23:28:31 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:06:46.737 23:28:31 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:46.737 23:28:31 accel -- common/autotest_common.sh@10 -- # set +x 00:06:46.737 ************************************ 00:06:46.737 START TEST accel_decomp_full_mcore 00:06:46.737 ************************************ 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:46.737 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:46.737 [2024-07-24 23:28:31.558465] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:46.737 [2024-07-24 23:28:31.558508] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid218068 ] 00:06:46.737 [2024-07-24 23:28:31.617755] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:46.737 [2024-07-24 23:28:31.696746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:46.737 [2024-07-24 23:28:31.696843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:46.737 [2024-07-24 23:28:31.696934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:46.737 [2024-07-24 23:28:31.696935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:46.996 23:28:31 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:46.996 23:28:31 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.929 00:06:47.929 real 0m1.369s 00:06:47.929 user 0m4.630s 00:06:47.929 sys 0m0.131s 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:47.929 23:28:32 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:47.929 ************************************ 00:06:47.929 END TEST accel_decomp_full_mcore 00:06:47.929 ************************************ 00:06:48.187 23:28:32 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:48.187 23:28:32 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:06:48.187 23:28:32 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.187 23:28:32 accel -- common/autotest_common.sh@10 -- # set +x 00:06:48.187 ************************************ 00:06:48.187 START TEST accel_decomp_mthread 
00:06:48.187 ************************************ 00:06:48.187 23:28:32 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:48.188 23:28:32 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:48.188 [2024-07-24 23:28:33.001952] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:48.188 [2024-07-24 23:28:33.001998] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid218348 ] 00:06:48.188 [2024-07-24 23:28:33.065252] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.188 [2024-07-24 23:28:33.137170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:48.446 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.447 
23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:48.447 23:28:33 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.378 00:06:49.378 real 0m1.368s 00:06:49.378 user 0m1.248s 00:06:49.378 sys 0m0.120s 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.378 23:28:34 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:06:49.378 ************************************
00:06:49.378 END TEST accel_decomp_mthread
00:06:49.378 ************************************
00:06:49.378 23:28:34 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:06:49.378 23:28:34 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:06:49.378 23:28:34 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:49.378 23:28:34 accel -- common/autotest_common.sh@10 -- # set +x
00:06:49.636 ************************************
00:06:49.636 START TEST accel_decomp_full_mthread ************************************
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config
00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:49.636 23:28:34
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:49.636 [2024-07-24 23:28:34.426570] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:49.636 [2024-07-24 23:28:34.426608] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid218633 ] 00:06:49.636 [2024-07-24 23:28:34.487629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.636 [2024-07-24 23:28:34.558900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.636 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # 
case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:49.637 23:28:34 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.009 00:06:51.009 real 0m1.378s 00:06:51.009 user 0m1.262s 00:06:51.009 sys 0m0.118s 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.009 23:28:35 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:06:51.009 ************************************ 00:06:51.009 END TEST accel_decomp_full_mthread 00:06:51.009 ************************************ 00:06:51.009 23:28:35 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:06:51.009 23:28:35 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:06:51.009 23:28:35 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:06:51.009 23:28:35 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:51.009 23:28:35 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=218896 00:06:51.009 23:28:35 accel -- accel/accel.sh@63 -- # waitforlisten 218896 00:06:51.009 23:28:35 accel -- common/autotest_common.sh@831 -- # '[' -z 218896 ']' 00:06:51.009 23:28:35 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:51.009 23:28:35 accel -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk.sock
00:06:51.009 23:28:35 accel -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:51.009 23:28:35 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63
00:06:51.009 23:28:35 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:51.009 23:28:35 accel -- common/autotest_common.sh@836 -- # local max_retries=100
00:06:51.009 23:28:35 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:51.009 23:28:35 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:51.009 23:28:35 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:06:51.009 23:28:35 accel -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:06:51.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:51.009 23:28:35 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:06:51.009 23:28:35 accel -- accel/accel.sh@40 -- # local IFS=,
00:06:51.009 23:28:35 accel -- common/autotest_common.sh@840 -- # xtrace_disable
00:06:51.009 23:28:35 accel -- accel/accel.sh@41 -- # jq -r .
00:06:51.009 23:28:35 accel -- common/autotest_common.sh@10 -- # set +x
00:06:51.010 [2024-07-24 23:28:35.871391] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization...
00:06:51.010 [2024-07-24 23:28:35.871435] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid218896 ]
00:06:51.010 [2024-07-24 23:28:35.933925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:51.267 [2024-07-24 23:28:36.011010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:51.525 [2024-07-24 23:28:36.363294] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:06:51.782 23:28:36 accel -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:06:51.782 23:28:36 accel -- common/autotest_common.sh@864 -- # return 0
00:06:51.782 23:28:36 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]]
00:06:51.782 23:28:36 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]]
00:06:51.782 23:28:36 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]]
00:06:51.782 23:28:36 accel -- accel/accel.sh@68 -- # [[ -n 1 ]]
00:06:51.782 23:28:36 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module
00:06:51.782 23:28:36 accel -- accel/accel.sh@56 -- # rpc_cmd save_config
00:06:51.782 23:28:36 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]'
00:06:51.782 23:28:36 accel -- common/autotest_common.sh@561 -- # xtrace_disable
00:06:51.782 23:28:36 accel -- common/autotest_common.sh@10 -- # set +x
00:06:51.782 23:28:36 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module
00:06:52.040 23:28:36 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:06:52.040 "method": "compressdev_scan_accel_module",
00:06:52.040 23:28:36 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ".
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:52.040 23:28:36 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:52.040 23:28:36 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- 
accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # IFS== 00:06:52.040 23:28:36 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:52.040 23:28:36 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:52.040 23:28:36 accel -- accel/accel.sh@75 -- # killprocess 218896 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@950 -- # '[' -z 218896 ']' 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@954 -- # kill -0 218896 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@955 -- # uname 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 218896 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 218896' 00:06:52.040 killing process with pid 218896 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@969 -- # kill 
218896 00:06:52.040 23:28:36 accel -- common/autotest_common.sh@974 -- # wait 218896 00:06:52.298 23:28:37 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:52.298 23:28:37 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.298 23:28:37 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:06:52.298 23:28:37 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.298 23:28:37 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.298 ************************************ 00:06:52.298 START TEST accel_cdev_comp 00:06:52.298 ************************************ 00:06:52.298 23:28:37 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.298 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:06:52.298 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:06:52.298 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.298 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.299 23:28:37 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=,
00:06:52.299 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r .
00:06:52.299 [2024-07-24 23:28:37.282833] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization...
00:06:52.299 [2024-07-24 23:28:37.282880] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid219183 ]
00:06:52.556 [2024-07-24 23:28:37.348050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:52.556 [2024-07-24 23:28:37.417611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:06:52.814 [2024-07-24 23:28:37.774205] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD [2024-07-24 23:28:37.775790] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23afa80 PMD being used: compress_qat
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=:
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val
00:06:52.814 [2024-07-24 23:28:37.779029] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25b4830 PMD being used: compress_qat
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in
00:06:52.814 23:28:37
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.814 23:28:37 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:52.814 23:28:37 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:54.185 23:28:38 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:54.185 00:06:54.185 real 0m1.670s 00:06:54.185 user 0m1.406s 00:06:54.185 sys 0m0.268s 00:06:54.185 23:28:38 
accel.accel_cdev_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:54.185 23:28:38 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:06:54.185 ************************************ 00:06:54.185 END TEST accel_cdev_comp 00:06:54.185 ************************************ 00:06:54.185 23:28:38 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.185 23:28:38 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:06:54.185 23:28:38 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:54.185 23:28:38 accel -- common/autotest_common.sh@10 -- # set +x 00:06:54.185 ************************************ 00:06:54.185 START TEST accel_cdev_decomp 00:06:54.185 ************************************ 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 
00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:54.185 23:28:38 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:54.185 [2024-07-24 23:28:39.002495] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:06:54.185 [2024-07-24 23:28:39.002530] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid219431 ] 00:06:54.185 [2024-07-24 23:28:39.065430] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.185 [2024-07-24 23:28:39.136312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.751 [2024-07-24 23:28:39.493633] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:54.751 [2024-07-24 23:28:39.495211] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfb7a80 PMD being used: compress_qat 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 [2024-07-24 23:28:39.498542] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11bc830 
PMD being used: compress_qat 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.751 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:54.752 23:28:39 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:55.685 23:28:40 
accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:55.685 00:06:55.685 real 0m1.649s 00:06:55.685 user 0m1.374s 00:06:55.685 sys 0m0.274s 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:55.685 23:28:40 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:55.685 ************************************ 00:06:55.685 END TEST accel_cdev_decomp 00:06:55.685 ************************************ 00:06:55.685 23:28:40 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.685 23:28:40 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:06:55.685 23:28:40 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:55.685 23:28:40 accel -- common/autotest_common.sh@10 -- # set +x 00:06:55.943 ************************************ 00:06:55.943 START TEST accel_cdev_decomp_full 00:06:55.943 ************************************ 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:06:55.943 23:28:40 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:06:55.943 [2024-07-24 23:28:40.733406] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:55.943 [2024-07-24 23:28:40.733463] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid219684 ] 00:06:55.943 [2024-07-24 23:28:40.797598] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.943 [2024-07-24 23:28:40.868769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.509 [2024-07-24 23:28:41.231573] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:56.509 [2024-07-24 23:28:41.233180] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x84ea80 PMD being used: compress_qat 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 [2024-07-24 23:28:41.235732] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x851db0 PMD being used: compress_qat 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:56.509 23:28:41 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:57.442 00:06:57.442 real 0m1.678s 00:06:57.442 user 0m1.412s 00:06:57.442 sys 0m0.270s 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:57.442 23:28:42 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:06:57.442 ************************************ 00:06:57.442 END TEST accel_cdev_decomp_full 00:06:57.442 ************************************ 00:06:57.442 23:28:42 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.442 23:28:42 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:06:57.442 23:28:42 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:57.442 23:28:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:57.700 ************************************ 00:06:57.700 START TEST accel_cdev_decomp_mcore 00:06:57.700 ************************************ 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:57.701 23:28:42 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:57.701 [2024-07-24 23:28:42.476582] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:57.701 [2024-07-24 23:28:42.476627] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid219954 ] 00:06:57.701 [2024-07-24 23:28:42.542075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:57.701 [2024-07-24 23:28:42.616208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.701 [2024-07-24 23:28:42.616308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:57.701 [2024-07-24 23:28:42.616397] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:57.701 [2024-07-24 23:28:42.616443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.266 [2024-07-24 23:28:43.005305] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:58.266 [2024-07-24 23:28:43.006956] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8fe120 PMD being used: compress_qat 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 [2024-07-24 23:28:43.011339] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f07d419b8b0 PMD being used: compress_qat 00:06:58.266 23:28:43 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 [2024-07-24 23:28:43.012250] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f07cc19b8b0 PMD being used: compress_qat 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:58.266 [2024-07-24 23:28:43.012807] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa23ba0 PMD being used: compress_qat 00:06:58.266 [2024-07-24 23:28:43.012888] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f07c419b8b0 PMD being used: compress_qat 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:58.266 
23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:58.266 23:28:43 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 
-- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:59.198 00:06:59.198 real 0m1.725s 
00:06:59.198 user 0m5.834s 00:06:59.198 sys 0m0.300s 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:59.198 23:28:44 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:59.198 ************************************ 00:06:59.198 END TEST accel_cdev_decomp_mcore 00:06:59.198 ************************************ 00:06:59.456 23:28:44 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:59.456 23:28:44 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:06:59.456 23:28:44 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:59.456 23:28:44 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.456 ************************************ 00:06:59.456 START TEST accel_cdev_decomp_full_mcore 00:06:59.456 ************************************ 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:59.456 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:59.457 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:59.457 [2024-07-24 23:28:44.263873] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:06:59.457 [2024-07-24 23:28:44.263919] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid220394 ] 00:06:59.457 [2024-07-24 23:28:44.327662] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:59.457 [2024-07-24 23:28:44.401764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:59.457 [2024-07-24 23:28:44.401850] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:06:59.457 [2024-07-24 23:28:44.401945] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:06:59.457 [2024-07-24 23:28:44.401947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.023 [2024-07-24 23:28:44.792543] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:00.023 [2024-07-24 23:28:44.794220] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x848120 PMD being used: compress_qat 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 [2024-07-24 23:28:44.797834] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7f6419b8b0 PMD being used: compress_qat 
00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 [2024-07-24 23:28:44.798846] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7f5c19b8b0 PMD being used: compress_qat 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:00.023 [2024-07-24 23:28:44.799403] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x84d620 PMD being used: compress_qat 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 [2024-07-24 23:28:44.799496] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7f5419b8b0 PMD being used: compress_qat 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@20 -- # val=decompress 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.023 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore 
-- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.024 23:28:44 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.999 
23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:00.999 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:01.000 23:28:45 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:01.000 00:07:01.000 real 0m1.725s 00:07:01.000 user 0m5.837s 00:07:01.000 sys 0m0.299s 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:01.000 23:28:45 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:01.000 ************************************ 00:07:01.000 END TEST accel_cdev_decomp_full_mcore 00:07:01.000 ************************************ 00:07:01.000 23:28:45 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:01.000 23:28:45 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:07:01.000 23:28:45 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:01.000 23:28:45 accel -- common/autotest_common.sh@10 -- # set +x 00:07:01.302 ************************************ 00:07:01.302 START TEST accel_cdev_decomp_mthread 00:07:01.302 ************************************ 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:01.302 23:28:46 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:01.302 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:01.302 [2024-07-24 23:28:46.059072] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:01.302 [2024-07-24 23:28:46.059125] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid220652 ] 00:07:01.302 [2024-07-24 23:28:46.125048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.302 [2024-07-24 23:28:46.195977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.559 [2024-07-24 23:28:46.557385] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:01.559 [2024-07-24 23:28:46.559061] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b02a80 PMD being used: compress_qat 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 [2024-07-24 23:28:46.563027] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b07c80 PMD being used: compress_qat 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r 
var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:01.816 [2024-07-24 23:28:46.564642] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c2aaf0 PMD being used: compress_qat 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 
23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.816 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:01.817 23:28:46 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.749 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:02.750 00:07:02.750 real 0m1.681s 00:07:02.750 user 0m1.417s 00:07:02.750 sys 0m0.271s 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:02.750 23:28:47 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:02.750 ************************************ 00:07:02.750 END TEST accel_cdev_decomp_mthread 00:07:02.750 ************************************ 00:07:02.750 23:28:47 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:02.750 23:28:47 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:07:02.750 23:28:47 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:02.750 23:28:47 accel -- common/autotest_common.sh@10 -- # set +x 00:07:03.008 ************************************ 00:07:03.008 START TEST accel_cdev_decomp_full_mthread 00:07:03.008 ************************************ 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:03.008 23:28:47 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:07:03.008 [2024-07-24 23:28:47.805194] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:07:03.008 [2024-07-24 23:28:47.805240] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid220908 ] 00:07:03.008 [2024-07-24 23:28:47.871585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.008 [2024-07-24 23:28:47.943810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.574 [2024-07-24 23:28:48.312537] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:03.574 [2024-07-24 23:28:48.314109] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c9ba80 PMD being used: compress_qat 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 [2024-07-24 23:28:48.317254] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c9bb20 PMD being used: compress_qat 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:03.574 [2024-07-24 23:28:48.318915] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ea0710 PMD being used: compress_qat 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:03.574 23:28:48 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:04.507 00:07:04.507 real 0m1.691s 00:07:04.507 user 0m1.416s 00:07:04.507 sys 0m0.272s 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:04.507 23:28:49 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:04.507 ************************************ 00:07:04.507 END TEST accel_cdev_decomp_full_mthread 00:07:04.507 ************************************ 00:07:04.507 23:28:49 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:07:04.507 23:28:49 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:04.507 23:28:49 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:04.507 23:28:49 accel -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:04.507 23:28:49 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.507 23:28:49 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.507 23:28:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.507 23:28:49 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.507 23:28:49 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.507 23:28:49 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.507 23:28:49 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.507 23:28:49 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:04.507 23:28:49 accel -- accel/accel.sh@41 -- # jq -r . 00:07:04.765 ************************************ 00:07:04.765 START TEST accel_dif_functional_tests 00:07:04.765 ************************************ 00:07:04.765 23:28:49 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:04.765 [2024-07-24 23:28:49.563199] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:04.765 [2024-07-24 23:28:49.563234] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221339 ] 00:07:04.765 [2024-07-24 23:28:49.625176] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:04.765 [2024-07-24 23:28:49.699211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.765 [2024-07-24 23:28:49.699230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:04.765 [2024-07-24 23:28:49.699232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.023 00:07:05.023 00:07:05.023 CUnit - A unit testing framework for C - Version 2.1-3 00:07:05.023 http://cunit.sourceforge.net/ 00:07:05.023 00:07:05.023 00:07:05.023 Suite: accel_dif 00:07:05.023 Test: verify: DIF generated, GUARD check ...passed 00:07:05.023 Test: verify: DIF generated, APPTAG check ...passed 00:07:05.023 Test: verify: DIF generated, REFTAG check ...passed 00:07:05.023 Test: verify: DIF not generated, GUARD check ...[2024-07-24 23:28:49.776451] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:05.023 passed 00:07:05.023 Test: verify: DIF not generated, APPTAG check ...[2024-07-24 23:28:49.776517] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:05.023 passed 00:07:05.023 Test: verify: DIF not generated, REFTAG check ...[2024-07-24 23:28:49.776536] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:05.023 passed 00:07:05.023 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:05.023 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-24 23:28:49.776579] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:05.023 passed 
00:07:05.023 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:05.023 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:05.023 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:05.023 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-24 23:28:49.776675] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:05.023 passed 00:07:05.023 Test: verify copy: DIF generated, GUARD check ...passed 00:07:05.023 Test: verify copy: DIF generated, APPTAG check ...passed 00:07:05.023 Test: verify copy: DIF generated, REFTAG check ...passed 00:07:05.023 Test: verify copy: DIF not generated, GUARD check ...[2024-07-24 23:28:49.776783] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:05.023 passed 00:07:05.023 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-24 23:28:49.776802] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:05.023 passed 00:07:05.023 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-24 23:28:49.776821] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:05.023 passed 00:07:05.023 Test: generate copy: DIF generated, GUARD check ...passed 00:07:05.023 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:05.023 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:05.023 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:05.023 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:05.023 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:05.023 Test: generate copy: iovecs-len validate ...[2024-07-24 23:28:49.776983] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:05.023 passed 00:07:05.023 Test: generate copy: buffer alignment validate ...passed 00:07:05.023 00:07:05.023 Run Summary: Type Total Ran Passed Failed Inactive 00:07:05.023 suites 1 1 n/a 0 0 00:07:05.023 tests 26 26 26 0 0 00:07:05.023 asserts 115 115 115 0 n/a 00:07:05.023 00:07:05.023 Elapsed time = 0.002 seconds 00:07:05.023 00:07:05.023 real 0m0.416s 00:07:05.023 user 0m0.634s 00:07:05.023 sys 0m0.144s 00:07:05.023 23:28:49 accel.accel_dif_functional_tests -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:05.024 23:28:49 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:05.024 ************************************ 00:07:05.024 END TEST accel_dif_functional_tests 00:07:05.024 ************************************ 00:07:05.024 00:07:05.024 real 0m45.104s 00:07:05.024 user 0m55.208s 00:07:05.024 sys 0m7.110s 00:07:05.024 23:28:49 accel -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:05.024 23:28:49 accel -- common/autotest_common.sh@10 -- # set +x 00:07:05.024 ************************************ 00:07:05.024 END TEST accel 00:07:05.024 ************************************ 00:07:05.024 23:28:50 -- spdk/autotest.sh@186 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:05.024 23:28:50 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:05.024 23:28:50 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:05.024 23:28:50 -- common/autotest_common.sh@10 -- # set +x 00:07:05.282 ************************************ 00:07:05.282 START TEST accel_rpc 00:07:05.282 ************************************ 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:05.282 * Looking for test storage... 
00:07:05.282 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:05.282 23:28:50 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:05.282 23:28:50 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=221435 00:07:05.282 23:28:50 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:05.282 23:28:50 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 221435 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@831 -- # '[' -z 221435 ']' 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.282 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:05.282 23:28:50 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.282 [2024-07-24 23:28:50.188674] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:05.282 [2024-07-24 23:28:50.188718] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221435 ] 00:07:05.282 [2024-07-24 23:28:50.250287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.540 [2024-07-24 23:28:50.327408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.105 23:28:50 accel_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:06.105 23:28:50 accel_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:06.105 23:28:50 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:06.105 23:28:50 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:06.105 23:28:50 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:06.105 23:28:50 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:06.105 23:28:50 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:06.105 23:28:50 accel_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:06.105 23:28:50 accel_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:06.105 23:28:50 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.105 ************************************ 00:07:06.105 START TEST accel_assign_opcode 00:07:06.105 ************************************ 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # accel_assign_opcode_test_suite 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:06.105 [2024-07-24 23:28:51.025436] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:06.105 [2024-07-24 23:28:51.033450] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:06.105 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:06.363 software 00:07:06.363 00:07:06.363 real 0m0.247s 00:07:06.363 user 0m0.040s 00:07:06.363 sys 0m0.010s 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:07:06.363 23:28:51 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:06.363 ************************************ 00:07:06.363 END TEST accel_assign_opcode 00:07:06.363 ************************************ 00:07:06.363 23:28:51 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 221435 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@950 -- # '[' -z 221435 ']' 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@954 -- # kill -0 221435 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@955 -- # uname 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 221435 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 221435' 00:07:06.363 killing process with pid 221435 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@969 -- # kill 221435 00:07:06.363 23:28:51 accel_rpc -- common/autotest_common.sh@974 -- # wait 221435 00:07:06.929 00:07:06.929 real 0m1.602s 00:07:06.929 user 0m1.655s 00:07:06.929 sys 0m0.426s 00:07:06.929 23:28:51 accel_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.929 23:28:51 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:06.929 ************************************ 00:07:06.929 END TEST accel_rpc 00:07:06.929 ************************************ 00:07:06.929 23:28:51 -- spdk/autotest.sh@189 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:06.929 23:28:51 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:06.929 23:28:51 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:06.929 23:28:51 -- 
common/autotest_common.sh@10 -- # set +x 00:07:06.929 ************************************ 00:07:06.929 START TEST app_cmdline 00:07:06.929 ************************************ 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:06.929 * Looking for test storage... 00:07:06.929 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:06.929 23:28:51 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:06.929 23:28:51 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:06.929 23:28:51 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=221744 00:07:06.929 23:28:51 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 221744 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 221744 ']' 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:06.929 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:06.929 23:28:51 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:06.929 [2024-07-24 23:28:51.842789] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
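This cmdline test starts `spdk_tgt` with `--rpcs-allowed spdk_get_version,rpc_get_methods`, so any RPC outside that allow-list is rejected with the JSON-RPC "Method not found" error (code -32601) that the test provokes with `env_dpdk_get_mem_stats`. A hedged sketch of such an allow-list dispatcher — this is not SPDK's implementation, and the names are illustrative:

```python
import json

def make_dispatcher(allowed, handlers):
    """Build a JSON-RPC dispatcher enforcing an --rpcs-allowed-style whitelist."""
    def dispatch(request_json):
        req = json.loads(request_json)
        if req["method"] not in allowed:
            # Disallowed methods are indistinguishable from unknown ones.
            return {"jsonrpc": "2.0", "id": req.get("id"),
                    "error": {"code": -32601, "message": "Method not found"}}
        return {"jsonrpc": "2.0", "id": req.get("id"),
                "result": handlers[req["method"]]()}
    return dispatch
```

Answering every non-whitelisted method with -32601 (rather than a distinct "forbidden" code) keeps the allow-list from leaking which RPCs the target actually supports.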
00:07:06.929 [2024-07-24 23:28:51.842837] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid221744 ] 00:07:06.929 [2024-07-24 23:28:51.907724] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.186 [2024-07-24 23:28:51.987316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.751 23:28:52 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:07.751 23:28:52 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:07.751 23:28:52 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:08.009 { 00:07:08.009 "version": "SPDK v24.09-pre git sha1 68f798423", 00:07:08.009 "fields": { 00:07:08.009 "major": 24, 00:07:08.009 "minor": 9, 00:07:08.009 "patch": 0, 00:07:08.009 "suffix": "-pre", 00:07:08.009 "commit": "68f798423" 00:07:08.009 } 00:07:08.009 } 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:08.009 23:28:52 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:08.009 23:28:52 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:08.009 23:28:52 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:08.267 request: 00:07:08.267 { 00:07:08.267 "method": "env_dpdk_get_mem_stats", 00:07:08.267 "req_id": 1 00:07:08.267 } 00:07:08.267 Got JSON-RPC error response 00:07:08.267 response: 00:07:08.267 { 00:07:08.267 
"code": -32601, 00:07:08.267 "message": "Method not found" 00:07:08.267 } 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:08.267 23:28:53 app_cmdline -- app/cmdline.sh@1 -- # killprocess 221744 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 221744 ']' 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 221744 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 221744 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 221744' 00:07:08.267 killing process with pid 221744 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@969 -- # kill 221744 00:07:08.267 23:28:53 app_cmdline -- common/autotest_common.sh@974 -- # wait 221744 00:07:08.534 00:07:08.534 real 0m1.655s 00:07:08.534 user 0m1.951s 00:07:08.534 sys 0m0.422s 00:07:08.534 23:28:53 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:08.534 23:28:53 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:08.534 ************************************ 00:07:08.534 END TEST app_cmdline 00:07:08.534 ************************************ 00:07:08.534 23:28:53 -- spdk/autotest.sh@190 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:08.534 
23:28:53 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:08.534 23:28:53 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:08.534 23:28:53 -- common/autotest_common.sh@10 -- # set +x 00:07:08.534 ************************************ 00:07:08.534 START TEST version 00:07:08.534 ************************************ 00:07:08.534 23:28:53 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:08.534 * Looking for test storage... 00:07:08.534 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:08.534 23:28:53 version -- app/version.sh@17 -- # get_header_version major 00:07:08.534 23:28:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # cut -f2 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # tr -d '"' 00:07:08.534 23:28:53 version -- app/version.sh@17 -- # major=24 00:07:08.534 23:28:53 version -- app/version.sh@18 -- # get_header_version minor 00:07:08.534 23:28:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # cut -f2 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # tr -d '"' 00:07:08.534 23:28:53 version -- app/version.sh@18 -- # minor=9 00:07:08.534 23:28:53 version -- app/version.sh@19 -- # get_header_version patch 00:07:08.534 23:28:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # cut -f2 00:07:08.534 23:28:53 version -- app/version.sh@14 -- # tr -d '"' 00:07:08.796 23:28:53 version -- app/version.sh@19 -- # patch=0 00:07:08.796 23:28:53 version 
-- app/version.sh@20 -- # get_header_version suffix 00:07:08.796 23:28:53 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:08.796 23:28:53 version -- app/version.sh@14 -- # tr -d '"' 00:07:08.796 23:28:53 version -- app/version.sh@14 -- # cut -f2 00:07:08.796 23:28:53 version -- app/version.sh@20 -- # suffix=-pre 00:07:08.796 23:28:53 version -- app/version.sh@22 -- # version=24.9 00:07:08.796 23:28:53 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:08.796 23:28:53 version -- app/version.sh@28 -- # version=24.9rc0 00:07:08.796 23:28:53 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:08.796 23:28:53 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:08.796 23:28:53 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:08.796 23:28:53 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:07:08.796 00:07:08.796 real 0m0.151s 00:07:08.796 user 0m0.077s 00:07:08.796 sys 0m0.105s 00:07:08.796 23:28:53 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:08.796 23:28:53 version -- common/autotest_common.sh@10 -- # set +x 00:07:08.796 ************************************ 00:07:08.796 END TEST version 00:07:08.796 ************************************ 00:07:08.796 23:28:53 -- spdk/autotest.sh@192 -- # '[' 1 -eq 1 ']' 00:07:08.796 23:28:53 -- spdk/autotest.sh@193 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:08.796 23:28:53 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:08.796 23:28:53 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:07:08.796 23:28:53 -- common/autotest_common.sh@10 -- # set +x 00:07:08.796 ************************************ 00:07:08.796 START TEST blockdev_general 00:07:08.796 ************************************ 00:07:08.796 23:28:53 blockdev_general -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:08.796 * Looking for test storage... 00:07:08.796 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:08.796 23:28:53 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:07:08.796 
23:28:53 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:07:08.796 23:28:53 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=222310 00:07:08.797 23:28:53 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:08.797 23:28:53 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:08.797 23:28:53 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 222310 00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@831 -- # '[' -z 222310 ']' 00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
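The version.sh test earlier derives `24.9rc0` by grepping `SPDK_VERSION_MAJOR/MINOR/PATCH/SUFFIX` out of `include/spdk/version.h` with a `grep | cut | tr -d '"'` pipeline. A Python equivalent of that pipeline, using sample header text with the values this run reports (the header contents below are assumed for illustration, not copied from the tree):

```python
import re

# Stand-in for include/spdk/version.h, with the values this run reports.
VERSION_H = '''
#define SPDK_VERSION_MAJOR 24
#define SPDK_VERSION_MINOR 9
#define SPDK_VERSION_PATCH 0
#define SPDK_VERSION_SUFFIX "-pre"
'''

def get_header_version(text, field):
    """Mimic get_header_version: grep the #define line, take field 2, strip quotes."""
    m = re.search(r'^#define SPDK_VERSION_%s[ \t]+"?([^"\n]+)"?' % field, text, re.M)
    return m.group(1)

major = get_header_version(VERSION_H, "MAJOR")          # "24"
minor = get_header_version(VERSION_H, "MINOR")          # "9"
patch = int(get_header_version(VERSION_H, "PATCH"))
version = f"{major}.{minor}" if patch == 0 else f"{major}.{minor}.{patch}"
if get_header_version(VERSION_H, "SUFFIX"):
    version += "rc0"  # version.sh appends rc0 for a pre-release suffix
# version is now "24.9rc0", matching py_version in the log
```

The final `[[ 24.9rc0 == \2\4\.\9\r\c\0 ]]` check in the log then compares this header-derived string against `spdk.__version__` from the Python package on `PYTHONPATH`.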
00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:08.797 23:28:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:08.797 [2024-07-24 23:28:53.795537] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:07:08.797 [2024-07-24 23:28:53.795582] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid222310 ] 00:07:09.053 [2024-07-24 23:28:53.860275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.053 [2024-07-24 23:28:53.931389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.617 23:28:54 blockdev_general -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:09.617 23:28:54 blockdev_general -- common/autotest_common.sh@864 -- # return 0 00:07:09.617 23:28:54 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:07:09.617 23:28:54 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:07:09.617 23:28:54 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:09.617 23:28:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:09.617 23:28:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:09.875 [2024-07-24 23:28:54.782046] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:09.875 [2024-07-24 23:28:54.782089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:09.875 00:07:09.875 [2024-07-24 23:28:54.790036] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:09.875 [2024-07-24 23:28:54.790051] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:09.875 00:07:09.875 Malloc0 00:07:09.875 Malloc1 00:07:09.875 
Malloc2 00:07:09.875 Malloc3 00:07:09.875 Malloc4 00:07:09.875 Malloc5 00:07:10.132 Malloc6 00:07:10.132 Malloc7 00:07:10.132 Malloc8 00:07:10.132 Malloc9 00:07:10.132 [2024-07-24 23:28:54.914327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:10.132 [2024-07-24 23:28:54.914365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:10.132 [2024-07-24 23:28:54.914377] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d61500 00:07:10.132 [2024-07-24 23:28:54.914383] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:10.132 [2024-07-24 23:28:54.915300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:10.132 [2024-07-24 23:28:54.915322] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:10.132 TestPT 00:07:10.132 23:28:54 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.132 23:28:54 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:10.132 5000+0 records in 00:07:10.132 5000+0 records out 00:07:10.132 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0162696 s, 629 MB/s 00:07:10.132 23:28:54 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:10.132 23:28:54 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.132 23:28:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.132 AIO0 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.133 23:28:55 blockdev_general -- 
common/autotest_common.sh@10 -- # set +x 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:07:10.133 23:28:55 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:10.133 23:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.392 23:28:55 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:10.392 23:28:55 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:07:10.392 23:28:55 blockdev_general -- 
bdev/blockdev.sh@748 -- # jq -r .name 00:07:10.393 23:28:55 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "27007d4b-45bc-4ee7-9501-bad740ba0d3a"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27007d4b-45bc-4ee7-9501-bad740ba0d3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b3ca61dc-09b7-5311-8eca-5183cabc2323"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b3ca61dc-09b7-5311-8eca-5183cabc2323",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e2226168-a93d-5eb6-975e-8503fce84d59"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e2226168-a93d-5eb6-975e-8503fce84d59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "9ee45303-5869-57fe-b02a-6f57bb958f7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ee45303-5869-57fe-b02a-6f57bb958f7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "ae217e2d-d155-5ec7-88dd-cdbfeb63a060"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ae217e2d-d155-5ec7-88dd-cdbfeb63a060",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "948a2fe2-b282-56eb-b119-8666fa630086"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "948a2fe2-b282-56eb-b119-8666fa630086",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' 
"split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "60d1e1c9-3738-56ff-b90b-dc50a50232f7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "60d1e1c9-3738-56ff-b90b-dc50a50232f7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "5e57dd1c-fbd3-56f1-9049-f03f312647d8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5e57dd1c-fbd3-56f1-9049-f03f312647d8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' 
"offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "beb25c0a-f3ef-5900-8892-609b26d5a3eb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "beb25c0a-f3ef-5900-8892-609b26d5a3eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ed342b1-5d9a-59bf-8721-599282d79d4d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ed342b1-5d9a-59bf-8721-599282d79d4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' 
"name": "Malloc2p7",' ' "aliases": [' ' "e20fb6e5-601c-537d-baf5-f021a793f27d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e20fb6e5-601c-537d-baf5-f021a793f27d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "72c3a8ca-280b-5268-899b-fc758ad3a78b"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72c3a8ca-280b-5268-899b-fc758ad3a78b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4d766dc8-bcc3-4c45-8104-2dd12d7bfcdc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0fa26c94-4bd6-4a2c-8d0d-cee6030946cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' 
"0580588f-4aff-41b9-bee0-b7beecd77712"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fd706e87-a2ec-4a32-9303-2d42699a3e46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "f29eb4bc-83da-4e95-bc81-37b30ca509ab",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "d7b1bac4-f601-437b-af87-145d3852915d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d7b1bac4-f601-437b-af87-145d3852915d",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d7b1bac4-f601-437b-af87-145d3852915d",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a9d2b9ae-6a3e-4f70-b6d6-292ade35e6a0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e31e4a71-65be-4eff-a034-92497e221200",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a187f8c4-1b70-49a5-a6e7-349457c09452"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a187f8c4-1b70-49a5-a6e7-349457c09452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:10.393 23:28:55 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:07:10.393 23:28:55 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:07:10.393 23:28:55 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:07:10.393 23:28:55 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 222310 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@950 -- # '[' -z 222310 ']' 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@954 -- # kill -0 222310 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@955 -- # uname 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 222310 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@968 -- # echo 'killing process with pid 222310' 00:07:10.393 killing process with pid 222310 00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@969 -- # kill 222310 
00:07:10.393 23:28:55 blockdev_general -- common/autotest_common.sh@974 -- # wait 222310 00:07:10.960 23:28:55 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:10.960 23:28:55 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:10.960 23:28:55 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:07:10.960 23:28:55 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:10.960 23:28:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:10.960 ************************************ 00:07:10.960 START TEST bdev_hello_world 00:07:10.960 ************************************ 00:07:10.960 23:28:55 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:10.960 [2024-07-24 23:28:55.734559] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:10.960 [2024-07-24 23:28:55.734597] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid222570 ] 00:07:10.960 [2024-07-24 23:28:55.797829] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.960 [2024-07-24 23:28:55.867400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.219 [2024-07-24 23:28:56.004024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:11.219 [2024-07-24 23:28:56.004063] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:11.219 [2024-07-24 23:28:56.004071] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:11.219 [2024-07-24 23:28:56.012032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:11.219 [2024-07-24 23:28:56.012048] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:11.219 [2024-07-24 23:28:56.020045] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:11.219 [2024-07-24 23:28:56.020057] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:11.219 [2024-07-24 23:28:56.086976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:11.219 [2024-07-24 23:28:56.087015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:11.219 [2024-07-24 23:28:56.087023] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc8e50 00:07:11.219 [2024-07-24 23:28:56.087028] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:11.219 [2024-07-24 23:28:56.088009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:07:11.219 [2024-07-24 23:28:56.088031] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:11.477 [2024-07-24 23:28:56.237812] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:11.477 [2024-07-24 23:28:56.237852] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:11.477 [2024-07-24 23:28:56.237875] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:11.477 [2024-07-24 23:28:56.237907] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:11.477 [2024-07-24 23:28:56.237942] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:11.477 [2024-07-24 23:28:56.237952] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:11.477 [2024-07-24 23:28:56.237980] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:07:11.477 00:07:11.477 [2024-07-24 23:28:56.237995] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:11.735 00:07:11.735 real 0m0.811s 00:07:11.735 user 0m0.538s 00:07:11.735 sys 0m0.220s 00:07:11.735 23:28:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:11.735 23:28:56 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:11.735 ************************************ 00:07:11.735 END TEST bdev_hello_world 00:07:11.735 ************************************ 00:07:11.735 23:28:56 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:07:11.735 23:28:56 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:11.735 23:28:56 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:11.735 23:28:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:11.735 ************************************ 00:07:11.735 START TEST bdev_bounds 00:07:11.735 ************************************ 00:07:11.735 23:28:56 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=222805 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 222805' 00:07:11.735 Process bdevio pid: 222805 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 222805 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 222805 ']' 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:11.735 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:11.735 23:28:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:11.735 [2024-07-24 23:28:56.613872] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:11.735 [2024-07-24 23:28:56.613912] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid222805 ] 00:07:11.735 [2024-07-24 23:28:56.678442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:11.992 [2024-07-24 23:28:56.759001] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.992 [2024-07-24 23:28:56.759096] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:11.992 [2024-07-24 23:28:56.759098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.992 [2024-07-24 23:28:56.893773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:11.992 [2024-07-24 23:28:56.893818] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:11.992 [2024-07-24 23:28:56.893825] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:11.992 [2024-07-24 23:28:56.901776] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:11.992 [2024-07-24 23:28:56.901791] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:11.992 [2024-07-24 23:28:56.909793] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:11.992 [2024-07-24 23:28:56.909807] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:11.992 [2024-07-24 23:28:56.976866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:11.992 [2024-07-24 23:28:56.976905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:11.992 [2024-07-24 23:28:56.976914] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2b53af0 
00:07:11.992 [2024-07-24 23:28:56.976920] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:11.992 [2024-07-24 23:28:56.977955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:11.992 [2024-07-24 23:28:56.977975] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:12.557 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:12.557 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:07:12.557 23:28:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:12.557 I/O targets: 00:07:12.557 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:12.557 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:12.557 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:12.557 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:12.557 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:12.558 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:12.558 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:12.558 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:12.558 concat0: 131072 blocks of 512 bytes (64 MiB) 00:07:12.558 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:12.558 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:12.558 00:07:12.558 00:07:12.558 CUnit - A unit testing framework for C - Version 2.1-3 00:07:12.558 http://cunit.sourceforge.net/ 00:07:12.558 00:07:12.558 00:07:12.558 Suite: bdevio tests on: AIO0 00:07:12.558 Test: blockdev write read block ...passed 00:07:12.558 Test: blockdev write zeroes read block ...passed 00:07:12.558 
Test: blockdev write zeroes read no split ...passed 00:07:12.558 Test: blockdev write zeroes read split ...passed 00:07:12.558 Test: blockdev write zeroes read split partial ...passed 00:07:12.558 Test: blockdev reset ...passed 00:07:12.558 Test: blockdev write read 8 blocks ...passed 00:07:12.558 Test: blockdev write read size > 128k ...passed 00:07:12.558 Test: blockdev write read invalid size ...passed 00:07:12.558 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.558 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.558 Test: blockdev write read max offset ...passed 00:07:12.558 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.558 Test: blockdev writev readv 8 blocks ...passed 00:07:12.558 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.558 Test: blockdev writev readv block ...passed 00:07:12.558 Test: blockdev writev readv size > 128k ...passed 00:07:12.558 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.558 Test: blockdev comparev and writev ...passed 00:07:12.558 Test: blockdev nvme passthru rw ...passed 00:07:12.558 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.558 Test: blockdev nvme admin passthru ...passed 00:07:12.558 Test: blockdev copy ...passed 00:07:12.558 Suite: bdevio tests on: raid1 00:07:12.558 Test: blockdev write read block ...passed 00:07:12.558 Test: blockdev write zeroes read block ...passed 00:07:12.558 Test: blockdev write zeroes read no split ...passed 00:07:12.558 Test: blockdev write zeroes read split ...passed 00:07:12.558 Test: blockdev write zeroes read split partial ...passed 00:07:12.558 Test: blockdev reset ...passed 00:07:12.558 Test: blockdev write read 8 blocks ...passed 00:07:12.558 Test: blockdev write read size > 128k ...passed 00:07:12.558 Test: blockdev write read invalid size ...passed 00:07:12.558 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:07:12.558 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.558 Test: blockdev write read max offset ...passed 00:07:12.558 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.558 Test: blockdev writev readv 8 blocks ...passed 00:07:12.558 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.558 Test: blockdev writev readv block ...passed 00:07:12.558 Test: blockdev writev readv size > 128k ...passed 00:07:12.558 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.558 Test: blockdev comparev and writev ...passed 00:07:12.558 Test: blockdev nvme passthru rw ...passed 00:07:12.558 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.558 Test: blockdev nvme admin passthru ...passed 00:07:12.558 Test: blockdev copy ...passed 00:07:12.558 Suite: bdevio tests on: concat0 00:07:12.558 Test: blockdev write read block ...passed 00:07:12.558 Test: blockdev write zeroes read block ...passed 00:07:12.558 Test: blockdev write zeroes read no split ...passed 00:07:12.558 Test: blockdev write zeroes read split ...passed 00:07:12.558 Test: blockdev write zeroes read split partial ...passed 00:07:12.558 Test: blockdev reset ...passed 00:07:12.558 Test: blockdev write read 8 blocks ...passed 00:07:12.558 Test: blockdev write read size > 128k ...passed 00:07:12.558 Test: blockdev write read invalid size ...passed 00:07:12.558 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.558 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.558 Test: blockdev write read max offset ...passed 00:07:12.558 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.558 Test: blockdev writev readv 8 blocks ...passed 00:07:12.558 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.558 Test: blockdev writev readv block ...passed 00:07:12.558 Test: blockdev writev readv size > 128k ...passed 00:07:12.558 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:07:12.558 Test: blockdev comparev and writev ...passed 00:07:12.558 Test: blockdev nvme passthru rw ...passed 00:07:12.558 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.558 Test: blockdev nvme admin passthru ...passed 00:07:12.558 Test: blockdev copy ...passed 00:07:12.558 Suite: bdevio tests on: raid0 00:07:12.558 Test: blockdev write read block ...passed 00:07:12.558 Test: blockdev write zeroes read block ...passed 00:07:12.558 Test: blockdev write zeroes read no split ...passed 00:07:12.558 Test: blockdev write zeroes read split ...passed 00:07:12.558 Test: blockdev write zeroes read split partial ...passed 00:07:12.558 Test: blockdev reset ...passed 00:07:12.558 Test: blockdev write read 8 blocks ...passed 00:07:12.558 Test: blockdev write read size > 128k ...passed 00:07:12.558 Test: blockdev write read invalid size ...passed 00:07:12.558 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.558 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.558 Test: blockdev write read max offset ...passed 00:07:12.558 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.558 Test: blockdev writev readv 8 blocks ...passed 00:07:12.558 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.558 Test: blockdev writev readv block ...passed 00:07:12.558 Test: blockdev writev readv size > 128k ...passed 00:07:12.558 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.558 Test: blockdev comparev and writev ...passed 00:07:12.558 Test: blockdev nvme passthru rw ...passed 00:07:12.558 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.558 Test: blockdev nvme admin passthru ...passed 00:07:12.558 Test: blockdev copy ...passed 00:07:12.558 Suite: bdevio tests on: TestPT 00:07:12.558 Test: blockdev write read block ...passed 00:07:12.558 Test: blockdev write zeroes read block ...passed 
00:07:12.558 Test: blockdev write zeroes read no split ...passed 00:07:12.558 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p7 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p6 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p5 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p4 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block 
...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p3 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p2 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 
00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p1 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc2p0 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write 
zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc1p1 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc1p0 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 
128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 Suite: bdevio tests on: Malloc0 00:07:12.818 Test: blockdev write read block ...passed 00:07:12.818 Test: blockdev write zeroes read block ...passed 00:07:12.818 Test: blockdev write zeroes read no split ...passed 00:07:12.818 Test: blockdev write zeroes read split ...passed 00:07:12.818 Test: blockdev write zeroes read split partial ...passed 00:07:12.818 Test: blockdev reset ...passed 00:07:12.818 Test: blockdev write read 8 blocks ...passed 00:07:12.818 Test: blockdev write read size > 128k ...passed 00:07:12.818 Test: blockdev write read invalid size ...passed 00:07:12.818 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:12.818 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:12.818 Test: blockdev write read max offset ...passed 00:07:12.818 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:12.818 Test: blockdev writev readv 8 blocks ...passed 00:07:12.818 Test: blockdev writev readv 30 x 1block ...passed 00:07:12.818 Test: blockdev writev readv block ...passed 00:07:12.818 Test: blockdev writev readv size > 128k ...passed 00:07:12.818 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:12.818 Test: blockdev comparev and writev ...passed 00:07:12.818 Test: blockdev nvme passthru rw ...passed 00:07:12.818 Test: blockdev nvme passthru vendor specific ...passed 00:07:12.818 Test: blockdev nvme admin passthru ...passed 00:07:12.818 Test: blockdev copy ...passed 00:07:12.818 00:07:12.818 Run Summary: Type Total Ran Passed Failed Inactive 00:07:12.818 suites 16 16 n/a 0 0 00:07:12.818 
tests 368 368 368 0 0 00:07:12.818 asserts 2224 2224 2224 0 n/a 00:07:12.818 00:07:12.818 Elapsed time = 0.460 seconds 00:07:12.818 0 00:07:12.818 23:28:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 222805 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 222805 ']' 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 222805 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 222805 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 222805' 00:07:12.819 killing process with pid 222805 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@969 -- # kill 222805 00:07:12.819 23:28:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@974 -- # wait 222805 00:07:13.076 23:28:58 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:07:13.076 00:07:13.076 real 0m1.464s 00:07:13.076 user 0m3.715s 00:07:13.076 sys 0m0.353s 00:07:13.076 23:28:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:13.076 23:28:58 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:13.076 ************************************ 00:07:13.076 END TEST bdev_bounds 00:07:13.076 ************************************ 00:07:13.076 23:28:58 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:13.076 23:28:58 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:07:13.076 23:28:58 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:13.076 23:28:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:13.334 ************************************ 00:07:13.334 START TEST bdev_nbd 00:07:13.334 ************************************ 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:07:13.334 23:28:58 
blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=223070 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 223070 /var/tmp/spdk-nbd.sock 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 223070 ']' 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-nbd.sock 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:13.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:13.334 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:13.334 [2024-07-24 23:28:58.147989] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:07:13.334 [2024-07-24 23:28:58.148026] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:13.334 [2024-07-24 23:28:58.210987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.334 [2024-07-24 23:28:58.288011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.591 [2024-07-24 23:28:58.424098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:13.591 [2024-07-24 23:28:58.424136] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:13.591 [2024-07-24 23:28:58.424144] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:13.591 [2024-07-24 23:28:58.432108] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:13.591 [2024-07-24 23:28:58.432122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:13.591 [2024-07-24 23:28:58.440123] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc2 00:07:13.591 [2024-07-24 23:28:58.440136] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:13.591 [2024-07-24 23:28:58.506839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:13.591 [2024-07-24 23:28:58.506878] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:13.591 [2024-07-24 23:28:58.506887] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240b560 00:07:13.591 [2024-07-24 23:28:58.506893] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:13.591 [2024-07-24 23:28:58.507911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:13.591 [2024-07-24 23:28:58.507932] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 
Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:14.157 23:28:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:14.157 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- 
# break 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:14.158 1+0 records in 00:07:14.158 1+0 records out 00:07:14.158 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021122 s, 19.4 MB/s 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:14.158 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:14.416 1+0 records in 00:07:14.416 1+0 records out 00:07:14.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000178774 s, 22.9 MB/s 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:14.416 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:14.674 23:28:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:14.674 1+0 records in 00:07:14.674 1+0 records out 00:07:14.674 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240858 s, 17.0 MB/s 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@889 -- # return 0 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:14.674 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:14.933 1+0 records in 00:07:14.933 1+0 records out 00:07:14.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023922 s, 17.1 MB/s 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:14.933 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 
00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.191 1+0 records in 00:07:15.191 1+0 records out 00:07:15.191 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205904 s, 19.9 MB/s 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:15.191 23:28:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i 
<= 20 )) 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.191 1+0 records in 00:07:15.191 1+0 records out 00:07:15.191 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022917 s, 17.9 MB/s 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:15.191 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:15.192 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:15.192 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:15.192 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.450 1+0 records in 00:07:15.450 1+0 records out 00:07:15.450 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246922 s, 16.6 MB/s 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:15.450 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:15.450 
23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.708 1+0 records in 00:07:15.708 1+0 records out 00:07:15.708 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271429 s, 15.1 MB/s 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:15.708 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.966 1+0 records in 00:07:15.966 1+0 records out 
00:07:15.966 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284027 s, 14.4 MB/s 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:15.966 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:15.967 23:29:00 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:15.967 1+0 records in 00:07:15.967 1+0 records out 00:07:15.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252117 s, 16.2 MB/s 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:15.967 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.225 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:16.225 23:29:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:16.225 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:16.225 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:16.225 23:29:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 
00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:16.225 1+0 records in 00:07:16.225 1+0 records out 00:07:16.225 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253175 s, 16.2 MB/s 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:16.225 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:16.483 1+0 records in 00:07:16.483 1+0 records out 00:07:16.483 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240278 s, 17.0 MB/s 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:16.483 
23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:16.483 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:16.742 1+0 records in 00:07:16.742 1+0 records out 00:07:16.742 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000365147 s, 11.2 MB/s 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.742 23:29:01 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@886 -- # size=4096 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:16.742 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:17.000 1+0 records in 00:07:17.000 1+0 records out 00:07:17.000 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000393993 s, 10.4 MB/s 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep 
-q -w nbd14 /proc/partitions 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:17.000 1+0 records in 00:07:17.000 1+0 records out 00:07:17.000 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321483 s, 12.7 MB/s 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:17.000 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:17.258 23:29:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:17.258 1+0 records in 00:07:17.258 1+0 records out 00:07:17.258 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000407544 s, 10.1 MB/s 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:17.258 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd0", 00:07:17.517 "bdev_name": "Malloc0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd1", 00:07:17.517 "bdev_name": "Malloc1p0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd2", 00:07:17.517 "bdev_name": "Malloc1p1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd3", 00:07:17.517 "bdev_name": "Malloc2p0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd4", 00:07:17.517 "bdev_name": "Malloc2p1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd5", 00:07:17.517 "bdev_name": "Malloc2p2" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd6", 00:07:17.517 "bdev_name": "Malloc2p3" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd7", 00:07:17.517 "bdev_name": "Malloc2p4" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd8", 00:07:17.517 "bdev_name": "Malloc2p5" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd9", 00:07:17.517 "bdev_name": "Malloc2p6" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd10", 00:07:17.517 "bdev_name": "Malloc2p7" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd11", 00:07:17.517 "bdev_name": "TestPT" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd12", 00:07:17.517 "bdev_name": "raid0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd13", 00:07:17.517 "bdev_name": "concat0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd14", 00:07:17.517 "bdev_name": "raid1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd15", 00:07:17.517 "bdev_name": "AIO0" 00:07:17.517 } 00:07:17.517 ]' 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd0", 00:07:17.517 "bdev_name": "Malloc0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd1", 00:07:17.517 "bdev_name": "Malloc1p0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd2", 00:07:17.517 "bdev_name": "Malloc1p1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd3", 00:07:17.517 "bdev_name": "Malloc2p0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd4", 00:07:17.517 "bdev_name": "Malloc2p1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd5", 00:07:17.517 "bdev_name": "Malloc2p2" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd6", 00:07:17.517 "bdev_name": "Malloc2p3" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd7", 00:07:17.517 "bdev_name": "Malloc2p4" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd8", 00:07:17.517 "bdev_name": "Malloc2p5" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd9", 00:07:17.517 "bdev_name": "Malloc2p6" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd10", 00:07:17.517 "bdev_name": "Malloc2p7" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd11", 00:07:17.517 "bdev_name": "TestPT" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd12", 00:07:17.517 "bdev_name": "raid0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd13", 00:07:17.517 "bdev_name": "concat0" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd14", 00:07:17.517 "bdev_name": "raid1" 00:07:17.517 }, 00:07:17.517 { 00:07:17.517 "nbd_device": "/dev/nbd15", 00:07:17.517 "bdev_name": "AIO0" 00:07:17.517 } 00:07:17.517 ]' 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- 
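The `nbd_get_disks` RPC above returns a JSON array of `{nbd_device, bdev_name}` pairs, which `nbd_common.sh@119` flattens into a bash array with `jq`. A minimal standalone reproduction of that extraction step (sample JSON inlined; assumes `jq` is installed):

```shell
#!/usr/bin/env bash
# Reproduce the nbd_common.sh@119 step: pull the /dev/nbdX paths
# out of an nbd_get_disks-style JSON array with jq.
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "Malloc0" },
  { "nbd_device": "/dev/nbd1", "bdev_name": "Malloc1p0" }
]'
# -r emits raw strings (no quotes), one per line, which word-splits
# cleanly into the array.
nbd_disks_name=($(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device'))
echo "${nbd_disks_name[@]}"   # /dev/nbd0 /dev/nbd1
```

The `bdev_name` values are carried alongside so the harness can later pair each `/dev/nbdX` back to its bdev when verifying data.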
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:17.517 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:17.775 23:29:02 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:17.775 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.033 23:29:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.033 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.291 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.547 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:18.803 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.061 23:29:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.318 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:19.576 23:29:04 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.576 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.872 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:20.130 23:29:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:20.130 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:20.388 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:20.646 23:29:05 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.646 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:20.904 23:29:05 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
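The `nbd_get_count` sequence above pipes an empty disk list through `grep -c /dev/nbd`, which prints `0` but exits nonzero when nothing matches; the bare `true` visible in the trace (`nbd_common.sh@65 -- # true`) is the guard that keeps the errexit-style harness from aborting on that exit status. A standalone sketch of the counting idiom (hypothetical helper name):

```shell
#!/usr/bin/env bash
set -e
# Count /dev/nbd entries in a newline-separated device list.
# grep -c prints the match count but exits 1 when the count is 0,
# so the pipeline is OR-ed with true -- the same guard seen in the
# nbd_common.sh@65 trace above.
count_nbd_devices() {
    local names=$1
    echo "$names" | grep -c /dev/nbd || true
}

count_nbd_devices ''                        # prints 0
count_nbd_devices $'/dev/nbd0\n/dev/nbd1'   # prints 2
```

This is why the trace shows `count=0` rather than a failed stage after all sixteen devices were stopped: the zero count is the expected post-teardown state, checked by `'[' 0 -ne 0 ']'` returning false.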
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:20.905 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:21.163 /dev/nbd0 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:21.163 1+0 records in 00:07:21.163 1+0 records out 00:07:21.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236227 s, 17.3 MB/s 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # size=4096 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:21.163 23:29:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:21.163 /dev/nbd1 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:21.163 1+0 records in 00:07:21.163 1+0 records out 00:07:21.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233215 s, 17.6 MB/s 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:21.163 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:21.422 /dev/nbd10 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:21.422 23:29:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:21.422 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:21.422 1+0 records in 00:07:21.422 1+0 records out 00:07:21.423 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224389 s, 18.3 MB/s 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:21.423 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:21.680 /dev/nbd11 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:21.680 1+0 records in 00:07:21.680 1+0 records out 00:07:21.680 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270178 s, 15.2 MB/s 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:21.680 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:21.936 /dev/nbd12 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:21.936 1+0 records in 00:07:21.936 1+0 records out 00:07:21.936 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209484 s, 19.6 MB/s 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:21.936 23:29:06 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:21.936 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:21.936 /dev/nbd13 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:22.194 1+0 records in 00:07:22.194 1+0 records out 00:07:22.194 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022806 s, 18.0 MB/s 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:22.194 23:29:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:22.194 /dev/nbd14 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:22.194 1+0 records in 00:07:22.194 1+0 records out 00:07:22.194 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257083 s, 
15.9 MB/s 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:22.194 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:22.451 /dev/nbd15 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:22.451 1+0 records in 00:07:22.451 1+0 records out 00:07:22.451 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269659 s, 15.2 MB/s 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:22.451 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:22.708 /dev/nbd2 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 
00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:22.708 1+0 records in 00:07:22.708 1+0 records out 00:07:22.708 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325069 s, 12.6 MB/s 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:22.708 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:22.966 /dev/nbd3 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:22.966 23:29:07 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:22.966 1+0 records in 00:07:22.966 1+0 records out 00:07:22.966 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317989 s, 12.9 MB/s 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:22.966 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:07:23.224 /dev/nbd4 00:07:23.224 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:23.224 23:29:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:23.224 1+0 records in 00:07:23.224 1+0 records out 00:07:23.224 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288012 s, 14.2 MB/s 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 
00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:23.224 /dev/nbd5 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:23.224 1+0 records in 00:07:23.224 1+0 records out 00:07:23.224 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221473 s, 18.5 MB/s 00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:07:23.224 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:23.225 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:23.482 /dev/nbd6 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:23.482 1+0 records in 00:07:23.482 1+0 records out 00:07:23.482 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000331796 s, 12.3 MB/s 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:23.482 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:23.740 /dev/nbd7 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:07:23.740 23:29:08 
blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:23.740 1+0 records in 00:07:23.740 1+0 records out 00:07:23.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318547 s, 12.9 MB/s 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:23.740 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:23.998 /dev/nbd8 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- 
# local i 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:23.998 1+0 records in 00:07:23.998 1+0 records out 00:07:23.998 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000360654 s, 11.4 MB/s 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:23.998 23:29:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:24.256 /dev/nbd9 00:07:24.256 23:29:09 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:24.256 1+0 records in 00:07:24.256 1+0 records out 00:07:24.256 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269758 s, 15.2 MB/s 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.256 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd0", 00:07:24.256 "bdev_name": "Malloc0" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd1", 00:07:24.256 "bdev_name": "Malloc1p0" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd10", 00:07:24.256 "bdev_name": "Malloc1p1" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd11", 00:07:24.256 "bdev_name": "Malloc2p0" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd12", 00:07:24.256 "bdev_name": "Malloc2p1" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd13", 00:07:24.256 "bdev_name": "Malloc2p2" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd14", 00:07:24.256 "bdev_name": "Malloc2p3" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd15", 00:07:24.256 "bdev_name": "Malloc2p4" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd2", 00:07:24.256 "bdev_name": "Malloc2p5" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd3", 00:07:24.256 "bdev_name": "Malloc2p6" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd4", 00:07:24.256 "bdev_name": "Malloc2p7" 00:07:24.256 }, 00:07:24.256 { 00:07:24.256 "nbd_device": "/dev/nbd5", 00:07:24.256 "bdev_name": "TestPT" 00:07:24.256 }, 00:07:24.256 { 00:07:24.257 "nbd_device": "/dev/nbd6", 00:07:24.257 
"bdev_name": "raid0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd7", 00:07:24.257 "bdev_name": "concat0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd8", 00:07:24.257 "bdev_name": "raid1" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd9", 00:07:24.257 "bdev_name": "AIO0" 00:07:24.257 } 00:07:24.257 ]' 00:07:24.257 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd0", 00:07:24.257 "bdev_name": "Malloc0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd1", 00:07:24.257 "bdev_name": "Malloc1p0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd10", 00:07:24.257 "bdev_name": "Malloc1p1" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd11", 00:07:24.257 "bdev_name": "Malloc2p0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd12", 00:07:24.257 "bdev_name": "Malloc2p1" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd13", 00:07:24.257 "bdev_name": "Malloc2p2" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd14", 00:07:24.257 "bdev_name": "Malloc2p3" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd15", 00:07:24.257 "bdev_name": "Malloc2p4" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd2", 00:07:24.257 "bdev_name": "Malloc2p5" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd3", 00:07:24.257 "bdev_name": "Malloc2p6" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd4", 00:07:24.257 "bdev_name": "Malloc2p7" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd5", 00:07:24.257 "bdev_name": "TestPT" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd6", 00:07:24.257 "bdev_name": "raid0" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd7", 00:07:24.257 "bdev_name": "concat0" 00:07:24.257 }, 00:07:24.257 { 
00:07:24.257 "nbd_device": "/dev/nbd8", 00:07:24.257 "bdev_name": "raid1" 00:07:24.257 }, 00:07:24.257 { 00:07:24.257 "nbd_device": "/dev/nbd9", 00:07:24.257 "bdev_name": "AIO0" 00:07:24.257 } 00:07:24.257 ]' 00:07:24.257 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.515 /dev/nbd1 00:07:24.515 /dev/nbd10 00:07:24.515 /dev/nbd11 00:07:24.515 /dev/nbd12 00:07:24.515 /dev/nbd13 00:07:24.515 /dev/nbd14 00:07:24.515 /dev/nbd15 00:07:24.515 /dev/nbd2 00:07:24.515 /dev/nbd3 00:07:24.515 /dev/nbd4 00:07:24.515 /dev/nbd5 00:07:24.515 /dev/nbd6 00:07:24.515 /dev/nbd7 00:07:24.515 /dev/nbd8 00:07:24.515 /dev/nbd9' 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.515 /dev/nbd1 00:07:24.515 /dev/nbd10 00:07:24.515 /dev/nbd11 00:07:24.515 /dev/nbd12 00:07:24.515 /dev/nbd13 00:07:24.515 /dev/nbd14 00:07:24.515 /dev/nbd15 00:07:24.515 /dev/nbd2 00:07:24.515 /dev/nbd3 00:07:24.515 /dev/nbd4 00:07:24.515 /dev/nbd5 00:07:24.515 /dev/nbd6 00:07:24.515 /dev/nbd7 00:07:24.515 /dev/nbd8 00:07:24.515 /dev/nbd9' 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:24.515 256+0 records in 00:07:24.515 256+0 records out 00:07:24.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103483 s, 101 MB/s 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.515 256+0 records in 00:07:24.515 256+0 records out 00:07:24.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0629011 s, 16.7 MB/s 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.515 256+0 records in 00:07:24.515 256+0 records out 00:07:24.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0651013 s, 16.1 MB/s 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:24.515 256+0 records in 00:07:24.515 256+0 records out 00:07:24.515 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0649985 s, 16.1 MB/s 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.515 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:24.773 256+0 records in 00:07:24.773 256+0 records out 00:07:24.773 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0649277 s, 16.1 MB/s 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:24.773 256+0 records in 00:07:24.773 256+0 records out 00:07:24.773 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0634716 s, 16.5 MB/s 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:24.773 256+0 records in 00:07:24.773 256+0 records out 00:07:24.773 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0646677 s, 16.2 MB/s 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.773 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:25.031 256+0 records in 
00:07:25.031 256+0 records out 00:07:25.031 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0654967 s, 16.0 MB/s 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:25.031 256+0 records in 00:07:25.031 256+0 records out 00:07:25.031 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0676502 s, 15.5 MB/s 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:25.031 256+0 records in 00:07:25.031 256+0 records out 00:07:25.031 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0653969 s, 16.0 MB/s 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:25.031 256+0 records in 00:07:25.031 256+0 records out 00:07:25.031 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0642671 s, 16.3 MB/s 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.031 23:29:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:25.290 256+0 records in 00:07:25.290 256+0 records out 00:07:25.290 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0683582 s, 15.3 MB/s 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.290 23:29:10 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:07:25.290 256+0 records in 00:07:25.290 256+0 records out 00:07:25.290 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0660316 s, 15.9 MB/s 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:25.290 256+0 records in 00:07:25.290 256+0 records out 00:07:25.290 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0656842 s, 16.0 MB/s 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:25.290 256+0 records in 00:07:25.290 256+0 records out 00:07:25.290 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0663498 s, 15.8 MB/s 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.290 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:25.548 256+0 records in 00:07:25.548 256+0 records out 00:07:25.548 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0672805 s, 15.6 MB/s 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:25.548 256+0 records in 00:07:25.548 256+0 records out 
00:07:25.548 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0639531 s, 16.4 MB/s 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:25.548 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:25.805 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.063 23:29:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:26.320 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:26.320 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:26.320 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:26.320 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.321 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:26.579 23:29:11 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.579 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.837 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 
/proc/partitions 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.095 23:29:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.095 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.352 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.614 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:27.872 23:29:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:28.129 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:28.129 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:28.129 23:29:13 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:28.129 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.129 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.130 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:28.130 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:28.130 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.130 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:28.130 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:28.387 
23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.387 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:28.645 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.646 23:29:13 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local 
nbd_list 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:28.904 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:29.162 malloc_lvol_verify 00:07:29.162 23:29:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:29.162 731eb7ed-1604-4dab-ab30-3814e92829c7 00:07:29.162 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:29.420 ab0ce5b6-babf-4443-92f0-2fd98b738608 00:07:29.420 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:29.678 /dev/nbd0 00:07:29.678 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:29.678 mke2fs 1.46.5 (30-Dec-2021) 00:07:29.678 Discarding device blocks: 0/4096 done 00:07:29.678 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:29.678 00:07:29.678 Allocating group tables: 0/1 done 00:07:29.678 Writing inode tables: 0/1 done 00:07:29.678 Creating journal (1024 blocks): done 00:07:29.678 Writing superblocks and filesystem accounting information: 0/1 done 00:07:29.679 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:29.679 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 223070 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 223070 ']' 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 223070 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:29.937 23:29:14 
blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 223070 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 223070' 00:07:29.937 killing process with pid 223070 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@969 -- # kill 223070 00:07:29.937 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@974 -- # wait 223070 00:07:30.197 23:29:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:07:30.197 00:07:30.197 real 0m16.894s 00:07:30.197 user 0m22.766s 00:07:30.197 sys 0m8.109s 00:07:30.197 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:30.197 23:29:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:30.197 ************************************ 00:07:30.197 END TEST bdev_nbd 00:07:30.197 ************************************ 00:07:30.197 23:29:15 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:07:30.197 23:29:15 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:07:30.197 23:29:15 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:07:30.197 23:29:15 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:07:30.197 23:29:15 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:30.197 23:29:15 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:30.197 23:29:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.197 ************************************ 00:07:30.197 START TEST bdev_fio 00:07:30.197 ************************************ 00:07:30.197 23:29:15 
blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:30.197 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 
00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # 
echo '[job_Malloc2p0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:07:30.197 23:29:15 
blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:30.197 23:29:15 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:30.197 ************************************ 00:07:30.197 START TEST bdev_fio_rw_verify 00:07:30.197 ************************************ 00:07:30.197 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:30.197 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:07:30.198 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:07:30.478 23:29:15 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:30.478 23:29:15 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:30.744 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:30.744 fio-3.35 00:07:30.744 Starting 16 threads 00:07:42.934 00:07:42.934 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=226730: Wed Jul 24 23:29:26 2024 00:07:42.934 read: IOPS=104k, BW=405MiB/s (424MB/s)(4049MiB/10001msec) 00:07:42.935 slat (nsec): min=1975, max=2979.1k, avg=32055.62, stdev=13928.72 00:07:42.935 clat (usec): min=8, max=3116, avg=263.13, stdev=122.90 00:07:42.935 lat (usec): min=15, max=3126, avg=295.18, stdev=130.21 00:07:42.935 clat percentiles (usec): 00:07:42.935 | 50.000th=[ 258], 99.000th=[ 523], 99.900th=[ 586], 99.990th=[ 783], 00:07:42.935 | 99.999th=[ 1074] 00:07:42.935 write: IOPS=162k, BW=633MiB/s (663MB/s)(6245MiB/9872msec); 0 zone resets 00:07:42.935 slat (usec): min=6, max=275, avg=41.80, stdev=13.11 00:07:42.935 clat (usec): min=9, max=1351, avg=301.21, stdev=135.32 00:07:42.935 lat (usec): min=30, max=1394, avg=343.01, stdev=141.79 00:07:42.935 clat percentiles (usec): 
00:07:42.935 | 50.000th=[ 289], 99.000th=[ 611], 99.900th=[ 766], 99.990th=[ 840], 00:07:42.935 | 99.999th=[ 1123] 00:07:42.935 bw ( KiB/s): min=531696, max=904551, per=98.97%, avg=641108.16, stdev=5610.07, samples=304 00:07:42.935 iops : min=132924, max=226137, avg=160277.00, stdev=1402.51, samples=304 00:07:42.935 lat (usec) : 10=0.01%, 20=0.02%, 50=0.98%, 100=6.20%, 250=36.03% 00:07:42.935 lat (usec) : 500=51.07%, 750=5.61%, 1000=0.09% 00:07:42.935 lat (msec) : 2=0.01%, 4=0.01% 00:07:42.935 cpu : usr=99.40%, sys=0.24%, ctx=715, majf=0, minf=2608 00:07:42.935 IO depths : 1=12.4%, 2=24.8%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:42.935 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:42.935 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:42.935 issued rwts: total=1036477,1598791,0,0 short=0,0,0,0 dropped=0,0,0,0 00:07:42.935 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:42.935 00:07:42.935 Run status group 0 (all jobs): 00:07:42.935 READ: bw=405MiB/s (424MB/s), 405MiB/s-405MiB/s (424MB/s-424MB/s), io=4049MiB (4245MB), run=10001-10001msec 00:07:42.935 WRITE: bw=633MiB/s (663MB/s), 633MiB/s-633MiB/s (663MB/s-663MB/s), io=6245MiB (6549MB), run=9872-9872msec 00:07:42.935 00:07:42.935 real 0m11.436s 00:07:42.935 user 2m47.841s 00:07:42.935 sys 0m0.981s 00:07:42.935 23:29:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:42.935 23:29:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:07:42.935 ************************************ 00:07:42.935 END TEST bdev_fio_rw_verify 00:07:42.935 ************************************ 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:42.935 23:29:26 
blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:07:42.935 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:42.936 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"27007d4b-45bc-4ee7-9501-bad740ba0d3a"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27007d4b-45bc-4ee7-9501-bad740ba0d3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b3ca61dc-09b7-5311-8eca-5183cabc2323"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b3ca61dc-09b7-5311-8eca-5183cabc2323",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' 
'}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e2226168-a93d-5eb6-975e-8503fce84d59"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e2226168-a93d-5eb6-975e-8503fce84d59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "9ee45303-5869-57fe-b02a-6f57bb958f7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ee45303-5869-57fe-b02a-6f57bb958f7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' 
"ae217e2d-d155-5ec7-88dd-cdbfeb63a060"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ae217e2d-d155-5ec7-88dd-cdbfeb63a060",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "948a2fe2-b282-56eb-b119-8666fa630086"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "948a2fe2-b282-56eb-b119-8666fa630086",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "60d1e1c9-3738-56ff-b90b-dc50a50232f7"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "60d1e1c9-3738-56ff-b90b-dc50a50232f7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "5e57dd1c-fbd3-56f1-9049-f03f312647d8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5e57dd1c-fbd3-56f1-9049-f03f312647d8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "beb25c0a-f3ef-5900-8892-609b26d5a3eb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 
512,' ' "num_blocks": 8192,' ' "uuid": "beb25c0a-f3ef-5900-8892-609b26d5a3eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ed342b1-5d9a-59bf-8721-599282d79d4d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ed342b1-5d9a-59bf-8721-599282d79d4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e20fb6e5-601c-537d-baf5-f021a793f27d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"e20fb6e5-601c-537d-baf5-f021a793f27d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "72c3a8ca-280b-5268-899b-fc758ad3a78b"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72c3a8ca-280b-5268-899b-fc758ad3a78b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' 
"e2a6c4ce-1d04-4cac-a6a2-f092adad2db1"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4d766dc8-bcc3-4c45-8104-2dd12d7bfcdc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0fa26c94-4bd6-4a2c-8d0d-cee6030946cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0580588f-4aff-41b9-bee0-b7beecd77712"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fd706e87-a2ec-4a32-9303-2d42699a3e46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "f29eb4bc-83da-4e95-bc81-37b30ca509ab",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "d7b1bac4-f601-437b-af87-145d3852915d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d7b1bac4-f601-437b-af87-145d3852915d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d7b1bac4-f601-437b-af87-145d3852915d",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a9d2b9ae-6a3e-4f70-b6d6-292ade35e6a0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e31e4a71-65be-4eff-a034-92497e221200",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a187f8c4-1b70-49a5-a6e7-349457c09452"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a187f8c4-1b70-49a5-a6e7-349457c09452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:42.936 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:07:42.936 Malloc1p0 00:07:42.936 Malloc1p1 00:07:42.936 Malloc2p0 00:07:42.936 Malloc2p1 00:07:42.936 Malloc2p2 00:07:42.936 Malloc2p3 00:07:42.936 Malloc2p4 00:07:42.936 Malloc2p5 00:07:42.936 Malloc2p6 00:07:42.937 Malloc2p7 00:07:42.937 TestPT 00:07:42.937 raid0 00:07:42.937 concat0 ]] 00:07:42.937 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "27007d4b-45bc-4ee7-9501-bad740ba0d3a"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "27007d4b-45bc-4ee7-9501-bad740ba0d3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "b3ca61dc-09b7-5311-8eca-5183cabc2323"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "b3ca61dc-09b7-5311-8eca-5183cabc2323",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "e2226168-a93d-5eb6-975e-8503fce84d59"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "e2226168-a93d-5eb6-975e-8503fce84d59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "9ee45303-5869-57fe-b02a-6f57bb958f7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9ee45303-5869-57fe-b02a-6f57bb958f7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "ae217e2d-d155-5ec7-88dd-cdbfeb63a060"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ae217e2d-d155-5ec7-88dd-cdbfeb63a060",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "948a2fe2-b282-56eb-b119-8666fa630086"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "948a2fe2-b282-56eb-b119-8666fa630086",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "60d1e1c9-3738-56ff-b90b-dc50a50232f7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "60d1e1c9-3738-56ff-b90b-dc50a50232f7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": 
"Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "5e57dd1c-fbd3-56f1-9049-f03f312647d8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5e57dd1c-fbd3-56f1-9049-f03f312647d8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "beb25c0a-f3ef-5900-8892-609b26d5a3eb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "beb25c0a-f3ef-5900-8892-609b26d5a3eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' 
'}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ed342b1-5d9a-59bf-8721-599282d79d4d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ed342b1-5d9a-59bf-8721-599282d79d4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e20fb6e5-601c-537d-baf5-f021a793f27d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e20fb6e5-601c-537d-baf5-f021a793f27d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"72c3a8ca-280b-5268-899b-fc758ad3a78b"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "72c3a8ca-280b-5268-899b-fc758ad3a78b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' 
{' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e2a6c4ce-1d04-4cac-a6a2-f092adad2db1",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4d766dc8-bcc3-4c45-8104-2dd12d7bfcdc",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "0fa26c94-4bd6-4a2c-8d0d-cee6030946cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0580588f-4aff-41b9-bee0-b7beecd77712"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0580588f-4aff-41b9-bee0-b7beecd77712",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "fd706e87-a2ec-4a32-9303-2d42699a3e46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "f29eb4bc-83da-4e95-bc81-37b30ca509ab",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "d7b1bac4-f601-437b-af87-145d3852915d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d7b1bac4-f601-437b-af87-145d3852915d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": 
"d7b1bac4-f601-437b-af87-145d3852915d",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a9d2b9ae-6a3e-4f70-b6d6-292ade35e6a0",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e31e4a71-65be-4eff-a034-92497e221200",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "a187f8c4-1b70-49a5-a6e7-349457c09452"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "a187f8c4-1b70-49a5-a6e7-349457c09452",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == 
true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:42.938 23:29:26 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:42.938 ************************************ 00:07:42.938 START TEST bdev_fio_trim 00:07:42.938 ************************************ 00:07:42.938 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:07:42.939 
23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:42.939 23:29:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:42.939 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:07:42.939 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:42.939 fio-3.35 00:07:42.939 Starting 14 threads 00:07:52.892 00:07:52.892 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=228813: Wed Jul 24 23:29:37 2024 00:07:52.892 write: IOPS=146k, BW=572MiB/s (600MB/s)(5720MiB/10002msec); 0 zone resets 00:07:52.892 slat (usec): min=2, max=638, avg=34.29, stdev= 9.47 00:07:52.892 clat (usec): min=26, max=2741, avg=237.96, stdev=82.98 00:07:52.892 lat (usec): 
min=38, max=2764, avg=272.25, stdev=86.30 00:07:52.892 clat percentiles (usec): 00:07:52.892 | 50.000th=[ 231], 99.000th=[ 420], 99.900th=[ 469], 99.990th=[ 537], 00:07:52.892 | 99.999th=[ 914] 00:07:52.892 bw ( KiB/s): min=523776, max=847669, per=100.00%, avg=587396.74, stdev=5680.57, samples=266 00:07:52.892 iops : min=130941, max=211917, avg=146846.32, stdev=1420.22, samples=266 00:07:52.892 trim: IOPS=146k, BW=572MiB/s (600MB/s)(5720MiB/10002msec); 0 zone resets 00:07:52.892 slat (usec): min=4, max=989, avg=23.38, stdev= 6.36 00:07:52.892 clat (usec): min=4, max=2764, avg=268.59, stdev=89.69 00:07:52.892 lat (usec): min=15, max=2779, avg=291.96, stdev=92.30 00:07:52.892 clat percentiles (usec): 00:07:52.892 | 50.000th=[ 262], 99.000th=[ 461], 99.900th=[ 506], 99.990th=[ 578], 00:07:52.892 | 99.999th=[ 725] 00:07:52.892 bw ( KiB/s): min=523776, max=847669, per=100.00%, avg=587396.74, stdev=5680.58, samples=266 00:07:52.892 iops : min=130941, max=211917, avg=146846.42, stdev=1420.22, samples=266 00:07:52.892 lat (usec) : 10=0.01%, 20=0.02%, 50=0.10%, 100=1.84%, 250=50.25% 00:07:52.892 lat (usec) : 500=47.70%, 750=0.08%, 1000=0.01% 00:07:52.892 lat (msec) : 2=0.01%, 4=0.01% 00:07:52.892 cpu : usr=99.66%, sys=0.01%, ctx=485, majf=0, minf=1046 00:07:52.892 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:52.892 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:52.892 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:52.892 issued rwts: total=0,1464269,1464273,0 short=0,0,0,0 dropped=0,0,0,0 00:07:52.892 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:52.892 00:07:52.892 Run status group 0 (all jobs): 00:07:52.892 WRITE: bw=572MiB/s (600MB/s), 572MiB/s-572MiB/s (600MB/s-600MB/s), io=5720MiB (5998MB), run=10002-10002msec 00:07:52.892 TRIM: bw=572MiB/s (600MB/s), 572MiB/s-572MiB/s (600MB/s-600MB/s), io=5720MiB (5998MB), run=10002-10002msec 00:07:53.150 
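The per-bdev job sections fed to fio above (blockdev.sh@355-357) are generated by filtering the bdev list down to devices that support unmap. A minimal sketch of that loop, using an inline name/unmap list as a stand-in for the real `rpc.py bdev_get_bdevs` + `jq 'select(.supported_io_types.unmap == true) | .name'` pipeline (the bdev names below are illustrative, not this run's full set):

```shell
#!/bin/sh
# Sketch of blockdev.sh@355-357: emit one fio job section per bdev that
# supports unmap. The name:unmap pairs stand in for RPC output filtered
# with jq in the real script.
bdevs='Malloc0:true raid1:false concat0:true'

for entry in $bdevs; do
  name=${entry%%:*}     # bdev name
  unmap=${entry##*:}    # whether the bdev supports unmap (trim)
  [ "$unmap" = true ] || continue   # trim jobs only run on unmap-capable bdevs
  printf '[job_%s]\nfilename=%s\n' "$name" "$name"
done
```

Only `Malloc0` and `concat0` produce sections here; `raid1` is skipped, matching how raid1 and AIO0 are absent from the trim job list above.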
00:07:53.150 real 0m11.233s 00:07:53.150 user 2m27.671s 00:07:53.150 sys 0m0.894s 00:07:53.150 23:29:37 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:53.150 23:29:37 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:07:53.150 ************************************ 00:07:53.150 END TEST bdev_fio_trim 00:07:53.150 ************************************ 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:07:53.150 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:07:53.150 00:07:53.150 real 0m22.974s 00:07:53.150 user 5m15.698s 00:07:53.150 sys 0m2.018s 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:53.150 23:29:38 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:53.150 ************************************ 00:07:53.150 END TEST bdev_fio 00:07:53.150 ************************************ 00:07:53.150 23:29:38 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:53.150 23:29:38 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:53.150 23:29:38 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:07:53.150 23:29:38 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:53.150 23:29:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 
00:07:53.150 ************************************ 00:07:53.150 START TEST bdev_verify 00:07:53.150 ************************************ 00:07:53.150 23:29:38 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:53.150 [2024-07-24 23:29:38.115670] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:07:53.150 [2024-07-24 23:29:38.115704] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid230552 ] 00:07:53.407 [2024-07-24 23:29:38.177163] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:53.407 [2024-07-24 23:29:38.248512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.407 [2024-07-24 23:29:38.248518] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.407 [2024-07-24 23:29:38.382127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:53.407 [2024-07-24 23:29:38.382167] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:53.407 [2024-07-24 23:29:38.382174] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:53.407 [2024-07-24 23:29:38.390135] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:53.407 [2024-07-24 23:29:38.390151] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:53.407 [2024-07-24 23:29:38.398154] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:53.407 [2024-07-24 23:29:38.398169] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:53.667 [2024-07-24 23:29:38.465962] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:53.667 [2024-07-24 23:29:38.465999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:53.667 [2024-07-24 23:29:38.466008] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb28fd0 00:07:53.667 [2024-07-24 23:29:38.466015] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:53.667 [2024-07-24 23:29:38.467051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:53.667 [2024-07-24 23:29:38.467072] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:53.667 Running I/O for 5 seconds... 00:07:59.024 00:07:59.024 Latency(us) 00:07:59.024 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:59.024 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x1000 00:07:59.024 Malloc0 : 5.12 1600.92 6.25 0.00 0.00 79826.15 450.56 158784.37 00:07:59.024 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x1000 length 0x1000 00:07:59.024 Malloc0 : 5.15 1589.90 6.21 0.00 0.00 80387.45 419.35 253655.53 00:07:59.024 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x800 00:07:59.024 Malloc1p0 : 5.16 819.19 3.20 0.00 0.00 155696.06 2356.18 151793.86 00:07:59.024 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x800 length 0x800 00:07:59.024 Malloc1p0 : 5.15 819.54 3.20 0.00 0.00 155690.93 2356.18 139810.13 00:07:59.024 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 
0x800 00:07:59.024 Malloc1p1 : 5.16 818.96 3.20 0.00 0.00 155438.75 2340.57 147799.28 00:07:59.024 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x800 length 0x800 00:07:59.024 Malloc1p1 : 5.16 819.28 3.20 0.00 0.00 155429.48 2340.57 137812.85 00:07:59.024 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p0 : 5.16 818.73 3.20 0.00 0.00 155189.92 2340.57 144803.35 00:07:59.024 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p0 : 5.16 819.06 3.20 0.00 0.00 155178.19 2340.57 135815.56 00:07:59.024 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p1 : 5.16 818.51 3.20 0.00 0.00 154941.76 2340.57 141807.42 00:07:59.024 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p1 : 5.16 818.83 3.20 0.00 0.00 154935.38 2356.18 130822.34 00:07:59.024 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p2 : 5.16 818.28 3.20 0.00 0.00 154687.04 2402.99 139810.13 00:07:59.024 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p2 : 5.16 818.60 3.20 0.00 0.00 154667.11 2402.99 128825.05 00:07:59.024 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p3 : 5.16 818.06 3.20 0.00 0.00 154434.07 2293.76 137812.85 00:07:59.024 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, 
IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p3 : 5.16 818.38 3.20 0.00 0.00 154425.41 2293.76 124331.15 00:07:59.024 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p4 : 5.16 817.84 3.19 0.00 0.00 154182.91 2387.38 136814.20 00:07:59.024 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p4 : 5.16 818.15 3.20 0.00 0.00 154170.83 2371.78 123332.51 00:07:59.024 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p5 : 5.17 817.62 3.19 0.00 0.00 153952.62 2434.19 135815.56 00:07:59.024 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p5 : 5.16 817.94 3.20 0.00 0.00 153926.94 2418.59 122333.87 00:07:59.024 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p6 : 5.17 817.41 3.19 0.00 0.00 153700.18 2371.78 132819.63 00:07:59.024 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p6 : 5.17 817.72 3.19 0.00 0.00 153671.73 2402.99 120336.58 00:07:59.024 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x200 00:07:59.024 Malloc2p7 : 5.17 817.11 3.19 0.00 0.00 153477.90 2356.18 129823.70 00:07:59.024 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x200 length 0x200 00:07:59.024 Malloc2p7 : 5.17 817.50 3.19 0.00 0.00 153438.43 2402.99 120336.58 
00:07:59.024 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x1000 00:07:59.024 TestPT : 5.17 816.67 3.19 0.00 0.00 153169.08 1739.82 116342.00 00:07:59.024 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x1000 length 0x1000 00:07:59.024 TestPT : 5.18 795.50 3.11 0.00 0.00 157063.99 11796.48 175761.31 00:07:59.024 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.024 Verification LBA range: start 0x0 length 0x2000 00:07:59.024 raid0 : 5.17 816.23 3.19 0.00 0.00 152804.95 1911.47 115343.36 00:07:59.024 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x2000 length 0x2000 00:07:59.025 raid0 : 5.17 817.07 3.19 0.00 0.00 152762.67 1966.08 108352.85 00:07:59.025 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x0 length 0x2000 00:07:59.025 concat0 : 5.18 815.82 3.19 0.00 0.00 152626.99 2044.10 118838.61 00:07:59.025 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x2000 length 0x2000 00:07:59.025 concat0 : 5.17 816.63 3.19 0.00 0.00 152590.14 1911.47 114344.72 00:07:59.025 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x0 length 0x1000 00:07:59.025 raid1 : 5.18 815.32 3.18 0.00 0.00 152479.10 2044.10 123831.83 00:07:59.025 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x1000 length 0x1000 00:07:59.025 raid1 : 5.18 816.19 3.19 0.00 0.00 152428.97 2481.01 118838.61 00:07:59.025 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x0 length 0x4e2 00:07:59.025 AIO0 : 5.18 815.08 3.18 0.00 0.00 
152316.27 873.81 126328.44 00:07:59.025 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:59.025 Verification LBA range: start 0x4e2 length 0x4e2 00:07:59.025 AIO0 : 5.18 815.82 3.19 0.00 0.00 152253.33 873.81 123831.83 00:07:59.025 =================================================================================================================== 00:07:59.025 Total : 27697.85 108.19 0.00 0.00 145582.79 419.35 253655.53 00:07:59.284 00:07:59.284 real 0m6.127s 00:07:59.284 user 0m11.626s 00:07:59.284 sys 0m0.249s 00:07:59.284 23:29:44 blockdev_general.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:59.284 23:29:44 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:07:59.284 ************************************ 00:07:59.284 END TEST bdev_verify 00:07:59.284 ************************************ 00:07:59.284 23:29:44 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:07:59.284 23:29:44 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:07:59.284 23:29:44 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:59.284 23:29:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:59.284 ************************************ 00:07:59.284 START TEST bdev_verify_big_io 00:07:59.284 ************************************ 00:07:59.284 23:29:44 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:07:59.542 [2024-07-24 23:29:44.320588] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:07:59.542 [2024-07-24 23:29:44.320625] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid231701 ] 00:07:59.542 [2024-07-24 23:29:44.384336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:59.542 [2024-07-24 23:29:44.456869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:59.542 [2024-07-24 23:29:44.456871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.800 [2024-07-24 23:29:44.591568] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:59.801 [2024-07-24 23:29:44.591611] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:59.801 [2024-07-24 23:29:44.591618] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:59.801 [2024-07-24 23:29:44.599576] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:59.801 [2024-07-24 23:29:44.599591] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:59.801 [2024-07-24 23:29:44.607595] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:59.801 [2024-07-24 23:29:44.607610] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:59.801 [2024-07-24 23:29:44.674897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:59.801 [2024-07-24 23:29:44.674936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:59.801 [2024-07-24 23:29:44.674945] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20fcfd0 00:07:59.801 [2024-07-24 23:29:44.674951] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:07:59.801 [2024-07-24 23:29:44.675935] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:59.801 [2024-07-24 23:29:44.675957] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:00.059 [2024-07-24 23:29:44.836849] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.837642] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.838852] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.839631] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.840871] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.841678] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.842919] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.844172] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.844971] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.846212] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.847016] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.848131] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.848803] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.849901] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.850582] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.851673] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:00.059 [2024-07-24 23:29:44.869993] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:00.059 [2024-07-24 23:29:44.871513] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:00.059 Running I/O for 5 seconds... 
00:08:06.618 00:08:06.618 Latency(us) 00:08:06.618 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:06.618 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.618 Verification LBA range: start 0x0 length 0x100 00:08:06.618 Malloc0 : 5.57 298.68 18.67 0.00 0.00 423037.70 600.75 1382123.03 00:08:06.618 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.618 Verification LBA range: start 0x100 length 0x100 00:08:06.618 Malloc0 : 5.65 294.30 18.39 0.00 0.00 429407.43 585.14 1581851.79 00:08:06.618 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.618 Verification LBA range: start 0x0 length 0x80 00:08:06.618 Malloc1p0 : 5.76 121.49 7.59 0.00 0.00 1005749.12 2527.82 1629786.70 00:08:06.618 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.618 Verification LBA range: start 0x80 length 0x80 00:08:06.618 Malloc1p0 : 5.85 98.51 6.16 0.00 0.00 1238420.11 2324.97 1829515.46 00:08:06.619 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x80 00:08:06.619 Malloc1p1 : 6.03 55.69 3.48 0.00 0.00 2106218.82 1771.03 3275551.70 00:08:06.619 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x80 length 0x80 00:08:06.619 Malloc1p1 : 6.08 57.88 3.62 0.00 0.00 2028789.51 1841.25 3083812.08 00:08:06.619 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p0 : 5.72 41.97 2.62 0.00 0.00 701973.45 561.74 1166415.97 00:08:06.619 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p0 : 5.73 44.64 2.79 0.00 0.00 663733.29 581.24 1030600.41 00:08:06.619 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p1 : 5.72 41.96 2.62 0.00 0.00 698166.46 596.85 1150437.67 00:08:06.619 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p1 : 5.74 44.64 2.79 0.00 0.00 660305.22 604.65 1014622.11 00:08:06.619 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p2 : 5.76 44.42 2.78 0.00 0.00 662321.27 741.18 1134459.37 00:08:06.619 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p2 : 5.74 44.63 2.79 0.00 0.00 656731.04 713.87 998643.81 00:08:06.619 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p3 : 5.76 44.41 2.78 0.00 0.00 658391.66 729.48 1110491.92 00:08:06.619 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p3 : 5.74 44.62 2.79 0.00 0.00 652805.84 717.78 978670.93 00:08:06.619 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p4 : 5.77 44.40 2.78 0.00 0.00 654496.07 667.06 1094513.62 00:08:06.619 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p4 : 5.74 44.62 2.79 0.00 0.00 649013.25 682.67 962692.63 00:08:06.619 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p5 : 5.77 44.40 2.77 0.00 0.00 650846.54 612.45 1078535.31 
00:08:06.619 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p5 : 5.74 44.61 2.79 0.00 0.00 645376.91 616.35 950708.91 00:08:06.619 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p6 : 5.77 44.39 2.77 0.00 0.00 647483.22 596.85 1062557.01 00:08:06.619 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p6 : 5.74 44.60 2.79 0.00 0.00 641568.66 616.35 934730.61 00:08:06.619 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x20 00:08:06.619 Malloc2p7 : 5.77 44.38 2.77 0.00 0.00 643697.14 589.04 1046578.71 00:08:06.619 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x20 length 0x20 00:08:06.619 Malloc2p7 : 5.78 47.03 2.94 0.00 0.00 607733.45 585.14 914757.73 00:08:06.619 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x100 00:08:06.619 TestPT : 6.07 55.54 3.47 0.00 0.00 1985812.85 64412.53 2796202.67 00:08:06.619 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x100 length 0x100 00:08:06.619 TestPT : 6.12 55.06 3.44 0.00 0.00 2003308.74 49432.87 2716311.16 00:08:06.619 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x200 00:08:06.619 raid0 : 6.12 62.71 3.92 0.00 0.00 1731481.83 1427.75 2940007.38 00:08:06.619 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x200 length 0x200 00:08:06.619 raid0 : 6.12 62.71 3.92 
0.00 0.00 1732665.31 1419.95 2748267.76 00:08:06.619 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x200 00:08:06.619 concat0 : 6.09 68.31 4.27 0.00 0.00 1572866.18 1435.55 2828159.27 00:08:06.619 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x200 length 0x200 00:08:06.619 concat0 : 6.08 72.49 4.53 0.00 0.00 1480374.92 1419.95 2636419.66 00:08:06.619 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x100 00:08:06.619 raid1 : 6.07 82.04 5.13 0.00 0.00 1291842.09 1934.87 2732289.46 00:08:06.619 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x100 length 0x100 00:08:06.619 raid1 : 6.09 91.04 5.69 0.00 0.00 1164782.72 1911.47 2540549.85 00:08:06.619 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x0 length 0x4e 00:08:06.619 AIO0 : 6.13 93.03 5.81 0.00 0.00 684826.90 553.94 1629786.70 00:08:06.619 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:06.619 Verification LBA range: start 0x4e length 0x4e 00:08:06.619 AIO0 : 6.12 83.60 5.22 0.00 0.00 761922.91 592.94 1454025.39 00:08:06.619 =================================================================================================================== 00:08:06.619 Total : 2362.77 147.67 0.00 0.00 940413.16 553.94 3275551.70 00:08:06.619 00:08:06.619 real 0m7.135s 00:08:06.619 user 0m13.559s 00:08:06.619 sys 0m0.312s 00:08:06.619 23:29:51 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:06.619 23:29:51 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:06.619 ************************************ 00:08:06.619 END TEST bdev_verify_big_io 00:08:06.619 
************************************ 00:08:06.619 23:29:51 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:06.619 23:29:51 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:06.619 23:29:51 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:06.619 23:29:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:06.619 ************************************ 00:08:06.619 START TEST bdev_write_zeroes 00:08:06.619 ************************************ 00:08:06.619 23:29:51 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:06.619 [2024-07-24 23:29:51.518766] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:06.619 [2024-07-24 23:29:51.518802] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid232861 ] 00:08:06.619 [2024-07-24 23:29:51.579276] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.877 [2024-07-24 23:29:51.650203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.877 [2024-07-24 23:29:51.787913] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:06.877 [2024-07-24 23:29:51.787955] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:06.877 [2024-07-24 23:29:51.787962] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:06.877 [2024-07-24 23:29:51.795921] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:06.877 [2024-07-24 23:29:51.795937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:06.877 [2024-07-24 23:29:51.803932] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:06.877 [2024-07-24 23:29:51.803945] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:06.877 [2024-07-24 23:29:51.870757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:06.877 [2024-07-24 23:29:51.870796] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:06.877 [2024-07-24 23:29:51.870804] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8fd80 00:08:06.877 [2024-07-24 23:29:51.870810] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:06.877 [2024-07-24 23:29:51.871731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:08:06.877 [2024-07-24 23:29:51.871750] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:07.135 Running I/O for 1 seconds... 00:08:08.507 00:08:08.507 Latency(us) 00:08:08.507 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:08.507 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc0 : 1.02 7644.03 29.86 0.00 0.00 16745.96 452.51 28835.84 00:08:08.507 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc1p0 : 1.02 7637.04 29.83 0.00 0.00 16742.81 639.76 28336.52 00:08:08.507 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc1p1 : 1.02 7630.13 29.81 0.00 0.00 16736.07 624.15 27712.37 00:08:08.507 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p0 : 1.02 7623.24 29.78 0.00 0.00 16722.03 624.15 27213.04 00:08:08.507 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p1 : 1.03 7616.38 29.75 0.00 0.00 16705.39 620.25 26588.89 00:08:08.507 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p2 : 1.03 7609.49 29.72 0.00 0.00 16694.31 600.75 26089.57 00:08:08.507 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p3 : 1.03 7602.60 29.70 0.00 0.00 16690.86 604.65 25590.25 00:08:08.507 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p4 : 1.03 7595.83 29.67 0.00 0.00 16684.45 596.85 25090.93 00:08:08.507 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p5 : 1.04 7636.32 29.83 0.00 0.00 16571.43 604.65 24466.77 00:08:08.507 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p6 : 1.04 7628.87 
29.80 0.00 0.00 16564.29 596.85 23967.45 00:08:08.507 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 Malloc2p7 : 1.04 7622.09 29.77 0.00 0.00 16555.40 600.75 23343.30 00:08:08.507 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 TestPT : 1.04 7615.36 29.75 0.00 0.00 16542.57 624.15 22719.15 00:08:08.507 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 raid0 : 1.04 7607.26 29.72 0.00 0.00 16532.89 1053.26 21720.50 00:08:08.507 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 concat0 : 1.04 7599.25 29.68 0.00 0.00 16505.37 1131.28 20597.03 00:08:08.507 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 raid1 : 1.05 7589.32 29.65 0.00 0.00 16478.86 1802.24 18599.74 00:08:08.507 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:08.507 AIO0 : 1.05 7583.55 29.62 0.00 0.00 16434.93 709.97 17975.59 00:08:08.507 =================================================================================================================== 00:08:08.507 Total : 121840.76 475.94 0.00 0.00 16618.45 452.51 28835.84 00:08:08.507 00:08:08.507 real 0m1.918s 00:08:08.507 user 0m1.637s 00:08:08.507 sys 0m0.229s 00:08:08.507 23:29:53 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:08.507 23:29:53 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:08:08.507 ************************************ 00:08:08.507 END TEST bdev_write_zeroes 00:08:08.507 ************************************ 00:08:08.507 23:29:53 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:08:08.507 23:29:53 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:08.507 23:29:53 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:08.507 23:29:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:08.507 ************************************ 00:08:08.507 START TEST bdev_json_nonenclosed 00:08:08.507 ************************************ 00:08:08.507 23:29:53 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:08.507 [2024-07-24 23:29:53.498878] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:08:08.507 [2024-07-24 23:29:53.498913] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233138 ] 00:08:08.765 [2024-07-24 23:29:53.560302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.765 [2024-07-24 23:29:53.632095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.765 [2024-07-24 23:29:53.632149] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:08:08.765 [2024-07-24 23:29:53.632174] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:08.765 [2024-07-24 23:29:53.632180] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:08.765 00:08:08.765 real 0m0.253s 00:08:08.765 user 0m0.167s 00:08:08.765 sys 0m0.084s 00:08:08.765 23:29:53 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:08.765 23:29:53 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:08.765 ************************************ 00:08:08.765 END TEST bdev_json_nonenclosed 00:08:08.765 ************************************ 00:08:08.765 23:29:53 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:08.765 23:29:53 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:08.765 23:29:53 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:08.765 23:29:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:09.023 ************************************ 00:08:09.023 START TEST bdev_json_nonarray 00:08:09.023 ************************************ 00:08:09.023 23:29:53 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:09.023 [2024-07-24 23:29:53.815957] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:09.023 [2024-07-24 23:29:53.816000] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233353 ] 00:08:09.023 [2024-07-24 23:29:53.878464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.023 [2024-07-24 23:29:53.948823] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.023 [2024-07-24 23:29:53.948899] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:08:09.023 [2024-07-24 23:29:53.948909] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:09.023 [2024-07-24 23:29:53.948914] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:09.023 00:08:09.023 real 0m0.255s 00:08:09.023 user 0m0.165s 00:08:09.023 sys 0m0.089s 00:08:09.023 23:29:54 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:09.023 23:29:54 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:09.023 ************************************ 00:08:09.023 END TEST bdev_json_nonarray 00:08:09.023 ************************************ 00:08:09.281 23:29:54 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:08:09.281 23:29:54 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:08:09.281 23:29:54 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:09.281 23:29:54 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:09.281 23:29:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:09.281 ************************************ 00:08:09.281 START TEST bdev_qos 00:08:09.281 ************************************ 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1125 -- # qos_test_suite '' 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=233378 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 233378' 00:08:09.281 Process qos testing pid: 233378 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 233378 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # '[' -z 233378 ']' 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:09.281 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:09.281 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:09.281 [2024-07-24 23:29:54.139585] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:09.281 [2024-07-24 23:29:54.139621] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid233378 ] 00:08:09.281 [2024-07-24 23:29:54.202050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.281 [2024-07-24 23:29:54.277335] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@864 -- # return 0 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.212 Malloc_0 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_0 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.212 23:29:54 
blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.212 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.212 [ 00:08:10.212 { 00:08:10.212 "name": "Malloc_0", 00:08:10.212 "aliases": [ 00:08:10.212 "31f75562-15d7-41be-b396-44100f47bff4" 00:08:10.212 ], 00:08:10.212 "product_name": "Malloc disk", 00:08:10.212 "block_size": 512, 00:08:10.212 "num_blocks": 262144, 00:08:10.212 "uuid": "31f75562-15d7-41be-b396-44100f47bff4", 00:08:10.212 "assigned_rate_limits": { 00:08:10.212 "rw_ios_per_sec": 0, 00:08:10.212 "rw_mbytes_per_sec": 0, 00:08:10.212 "r_mbytes_per_sec": 0, 00:08:10.212 "w_mbytes_per_sec": 0 00:08:10.212 }, 00:08:10.212 "claimed": false, 00:08:10.212 "zoned": false, 00:08:10.212 "supported_io_types": { 00:08:10.212 "read": true, 00:08:10.212 "write": true, 00:08:10.212 "unmap": true, 00:08:10.212 "flush": true, 00:08:10.212 "reset": true, 00:08:10.212 "nvme_admin": false, 00:08:10.212 "nvme_io": false, 00:08:10.212 "nvme_io_md": false, 00:08:10.212 "write_zeroes": true, 00:08:10.212 "zcopy": true, 00:08:10.213 "get_zone_info": false, 00:08:10.213 "zone_management": false, 00:08:10.213 "zone_append": false, 00:08:10.213 "compare": false, 00:08:10.213 "compare_and_write": false, 00:08:10.213 "abort": true, 00:08:10.213 "seek_hole": false, 00:08:10.213 "seek_data": false, 00:08:10.213 "copy": true, 00:08:10.213 "nvme_iov_md": false 00:08:10.213 }, 00:08:10.213 "memory_domains": [ 00:08:10.213 { 00:08:10.213 "dma_device_id": "system", 00:08:10.213 "dma_device_type": 1 00:08:10.213 }, 00:08:10.213 { 00:08:10.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:10.213 
"dma_device_type": 2 00:08:10.213 } 00:08:10.213 ], 00:08:10.213 "driver_specific": {} 00:08:10.213 } 00:08:10.213 ] 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.213 Null_1 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Null_1 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.213 23:29:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:10.213 
23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.213 [ 00:08:10.213 { 00:08:10.213 "name": "Null_1", 00:08:10.213 "aliases": [ 00:08:10.213 "bd85a82f-0a6c-4698-9a3a-a2416faa479b" 00:08:10.213 ], 00:08:10.213 "product_name": "Null disk", 00:08:10.213 "block_size": 512, 00:08:10.213 "num_blocks": 262144, 00:08:10.213 "uuid": "bd85a82f-0a6c-4698-9a3a-a2416faa479b", 00:08:10.213 "assigned_rate_limits": { 00:08:10.213 "rw_ios_per_sec": 0, 00:08:10.213 "rw_mbytes_per_sec": 0, 00:08:10.213 "r_mbytes_per_sec": 0, 00:08:10.213 "w_mbytes_per_sec": 0 00:08:10.213 }, 00:08:10.213 "claimed": false, 00:08:10.213 "zoned": false, 00:08:10.213 "supported_io_types": { 00:08:10.213 "read": true, 00:08:10.213 "write": true, 00:08:10.213 "unmap": false, 00:08:10.213 "flush": false, 00:08:10.213 "reset": true, 00:08:10.213 "nvme_admin": false, 00:08:10.213 "nvme_io": false, 00:08:10.213 "nvme_io_md": false, 00:08:10.213 "write_zeroes": true, 00:08:10.213 "zcopy": false, 00:08:10.213 "get_zone_info": false, 00:08:10.213 "zone_management": false, 00:08:10.213 "zone_append": false, 00:08:10.213 "compare": false, 00:08:10.213 "compare_and_write": false, 00:08:10.213 "abort": true, 00:08:10.213 "seek_hole": false, 00:08:10.213 "seek_data": false, 00:08:10.213 "copy": false, 00:08:10.213 "nvme_iov_md": false 00:08:10.213 }, 00:08:10.213 "driver_specific": {} 00:08:10.213 } 00:08:10.213 ] 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 
00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:10.213 23:29:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:08:10.213 Running I/O for 60 seconds... 
00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 95943.80 383775.21 0.00 0.00 387072.00 0.00 0.00 ' 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=95943.80 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 95943 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=95943 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=23000 00:08:15.469 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 23000 -gt 1000 ']' 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 23000 Malloc_0 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 23000 IOPS Malloc_0 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:15.470 23:30:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:15.470 ************************************ 00:08:15.470 START TEST bdev_qos_iops 00:08:15.470 ************************************ 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # run_qos_test 23000 IOPS Malloc_0 00:08:15.470 23:30:00 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=23000 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:15.470 23:30:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 23018.09 92072.37 0.00 0.00 93012.00 0.00 0.00 ' 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=23018.09 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 23018 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=23018 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=20700 00:08:20.718 23:30:05 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=25300 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 23018 -lt 20700 ']' 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 23018 -gt 25300 ']' 00:08:20.718 00:08:20.718 real 0m5.179s 00:08:20.718 user 0m0.085s 00:08:20.718 sys 0m0.039s 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:20.718 23:30:05 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:20.718 ************************************ 00:08:20.718 END TEST bdev_qos_iops 00:08:20.718 ************************************ 00:08:20.718 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:08:20.718 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:08:20.718 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:08:20.718 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:20.718 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:20.719 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:08:20.719 23:30:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 30756.16 123024.63 0.00 0.00 124928.00 0.00 0.00 ' 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:25.973 23:30:10 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=124928.00 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 124928 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=124928 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=12 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 12 -lt 2 ']' 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 12 Null_1 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 12 BANDWIDTH Null_1 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:25.973 23:30:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:25.973 ************************************ 00:08:25.973 START TEST bdev_qos_bw 00:08:25.973 ************************************ 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # run_qos_test 12 BANDWIDTH Null_1 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=12 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local 
limit_type=BANDWIDTH 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:25.973 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:08:25.974 23:30:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 3070.32 12281.30 0.00 0.00 12420.00 0.00 0.00 ' 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=12420.00 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 12420 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=12420 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=12288 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=11059 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=13516 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 12420 -lt 11059 ']' 00:08:31.233 23:30:15 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 12420 -gt 13516 ']' 00:08:31.233 00:08:31.233 real 0m5.177s 00:08:31.233 user 0m0.078s 00:08:31.233 sys 0m0.038s 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:08:31.233 ************************************ 00:08:31.233 END TEST bdev_qos_bw 00:08:31.233 ************************************ 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:31.233 23:30:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:31.233 ************************************ 00:08:31.233 START TEST bdev_qos_ro_bw 00:08:31.233 ************************************ 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result 
BANDWIDTH Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:08:31.233 23:30:15 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.22 2044.86 0.00 0.00 2056.00 0.00 0.00 ' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2056.00 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2056 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2056 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 
-- # upper_limit=2252 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -lt 1843 ']' 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -gt 2252 ']' 00:08:36.545 00:08:36.545 real 0m5.144s 00:08:36.545 user 0m0.080s 00:08:36.545 sys 0m0.034s 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:36.545 23:30:21 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:08:36.545 ************************************ 00:08:36.545 END TEST bdev_qos_ro_bw 00:08:36.545 ************************************ 00:08:36.545 23:30:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:08:36.545 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:36.545 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:36.804 00:08:36.804 Latency(us) 00:08:36.804 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:36.804 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:36.804 Malloc_0 : 26.45 32085.50 125.33 0.00 0.00 7902.26 1388.74 503316.48 00:08:36.804 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:36.804 Null_1 : 26.56 32020.29 125.08 0.00 0.00 7979.52 546.13 102860.31 00:08:36.804 
=================================================================================================================== 00:08:36.804 Total : 64105.79 250.41 0.00 0.00 7940.92 546.13 503316.48 00:08:36.804 0 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 233378 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # '[' -z 233378 ']' 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # kill -0 233378 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # uname 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 233378 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # echo 'killing process with pid 233378' 00:08:36.804 killing process with pid 233378 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@969 -- # kill 233378 00:08:36.804 Received shutdown signal, test time was about 26.609653 seconds 00:08:36.804 00:08:36.804 Latency(us) 00:08:36.804 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:36.804 =================================================================================================================== 00:08:36.804 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:36.804 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@974 -- # wait 233378 00:08:37.062 23:30:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 
00:08:37.062 00:08:37.062 real 0m27.847s 00:08:37.062 user 0m28.406s 00:08:37.062 sys 0m0.591s 00:08:37.062 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:37.062 23:30:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:37.062 ************************************ 00:08:37.062 END TEST bdev_qos 00:08:37.062 ************************************ 00:08:37.062 23:30:21 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:08:37.062 23:30:21 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:37.062 23:30:21 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:37.062 23:30:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.062 ************************************ 00:08:37.062 START TEST bdev_qd_sampling 00:08:37.062 ************************************ 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # qd_sampling_test_suite '' 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=238532 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 238532' 00:08:37.062 Process bdev QD sampling period testing pid: 238532 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 238532 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling 
-- common/autotest_common.sh@831 -- # '[' -z 238532 ']' 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:37.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:37.062 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:37.062 [2024-07-24 23:30:22.060290] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:08:37.062 [2024-07-24 23:30:22.060333] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid238532 ] 00:08:37.320 [2024-07-24 23:30:22.127474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:37.320 [2024-07-24 23:30:22.204855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:37.320 [2024-07-24 23:30:22.204858] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@864 -- # return 0 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:37.885 
23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:37.885 Malloc_QD 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_QD 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # local i 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:37.885 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:38.142 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:38.142 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:08:38.142 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:38.142 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:38.142 [ 00:08:38.142 { 00:08:38.142 "name": "Malloc_QD", 00:08:38.142 "aliases": [ 00:08:38.142 "5f4eece2-b7de-4102-9c28-8ece26e34713" 00:08:38.142 ], 00:08:38.142 "product_name": "Malloc disk", 00:08:38.142 "block_size": 512, 00:08:38.142 "num_blocks": 262144, 00:08:38.142 "uuid": "5f4eece2-b7de-4102-9c28-8ece26e34713", 
00:08:38.142 "assigned_rate_limits": { 00:08:38.142 "rw_ios_per_sec": 0, 00:08:38.142 "rw_mbytes_per_sec": 0, 00:08:38.143 "r_mbytes_per_sec": 0, 00:08:38.143 "w_mbytes_per_sec": 0 00:08:38.143 }, 00:08:38.143 "claimed": false, 00:08:38.143 "zoned": false, 00:08:38.143 "supported_io_types": { 00:08:38.143 "read": true, 00:08:38.143 "write": true, 00:08:38.143 "unmap": true, 00:08:38.143 "flush": true, 00:08:38.143 "reset": true, 00:08:38.143 "nvme_admin": false, 00:08:38.143 "nvme_io": false, 00:08:38.143 "nvme_io_md": false, 00:08:38.143 "write_zeroes": true, 00:08:38.143 "zcopy": true, 00:08:38.143 "get_zone_info": false, 00:08:38.143 "zone_management": false, 00:08:38.143 "zone_append": false, 00:08:38.143 "compare": false, 00:08:38.143 "compare_and_write": false, 00:08:38.143 "abort": true, 00:08:38.143 "seek_hole": false, 00:08:38.143 "seek_data": false, 00:08:38.143 "copy": true, 00:08:38.143 "nvme_iov_md": false 00:08:38.143 }, 00:08:38.143 "memory_domains": [ 00:08:38.143 { 00:08:38.143 "dma_device_id": "system", 00:08:38.143 "dma_device_type": 1 00:08:38.143 }, 00:08:38.143 { 00:08:38.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:38.143 "dma_device_type": 2 00:08:38.143 } 00:08:38.143 ], 00:08:38.143 "driver_specific": {} 00:08:38.143 } 00:08:38.143 ] 00:08:38.143 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:38.143 23:30:22 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@907 -- # return 0 00:08:38.143 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:08:38.143 23:30:22 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:38.143 Running I/O for 5 seconds... 
00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:08:40.038 "tick_rate": 2100000000, 00:08:40.038 "ticks": 11878010669898936, 00:08:40.038 "bdevs": [ 00:08:40.038 { 00:08:40.038 "name": "Malloc_QD", 00:08:40.038 "bytes_read": 983609856, 00:08:40.038 "num_read_ops": 240132, 00:08:40.038 "bytes_written": 0, 00:08:40.038 "num_write_ops": 0, 00:08:40.038 "bytes_unmapped": 0, 00:08:40.038 "num_unmap_ops": 0, 00:08:40.038 "bytes_copied": 0, 00:08:40.038 "num_copy_ops": 0, 00:08:40.038 "read_latency_ticks": 2072474706962, 00:08:40.038 "max_read_latency_ticks": 10542846, 00:08:40.038 "min_read_latency_ticks": 181386, 
00:08:40.038 "write_latency_ticks": 0, 00:08:40.038 "max_write_latency_ticks": 0, 00:08:40.038 "min_write_latency_ticks": 0, 00:08:40.038 "unmap_latency_ticks": 0, 00:08:40.038 "max_unmap_latency_ticks": 0, 00:08:40.038 "min_unmap_latency_ticks": 0, 00:08:40.038 "copy_latency_ticks": 0, 00:08:40.038 "max_copy_latency_ticks": 0, 00:08:40.038 "min_copy_latency_ticks": 0, 00:08:40.038 "io_error": {}, 00:08:40.038 "queue_depth_polling_period": 10, 00:08:40.038 "queue_depth": 512, 00:08:40.038 "io_time": 20, 00:08:40.038 "weighted_io_time": 10240 00:08:40.038 } 00:08:40.038 ] 00:08:40.038 }' 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:40.038 00:08:40.038 Latency(us) 00:08:40.038 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:40.038 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:08:40.038 Malloc_QD : 1.99 61931.23 241.92 0.00 0.00 4124.34 1068.86 4431.48 00:08:40.038 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:40.038 Malloc_QD : 1.99 62418.31 243.82 0.00 0.00 4092.66 639.76 5024.43 00:08:40.038 =================================================================================================================== 00:08:40.038 Total : 124349.53 485.74 
0.00 0.00 4108.43 639.76 5024.43 00:08:40.038 0 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 238532 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # '[' -z 238532 ']' 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # kill -0 238532 00:08:40.038 23:30:24 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # uname 00:08:40.038 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:40.038 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 238532 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # echo 'killing process with pid 238532' 00:08:40.296 killing process with pid 238532 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@969 -- # kill 238532 00:08:40.296 Received shutdown signal, test time was about 2.054639 seconds 00:08:40.296 00:08:40.296 Latency(us) 00:08:40.296 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:40.296 =================================================================================================================== 00:08:40.296 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@974 -- # wait 238532 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:08:40.296 00:08:40.296 real 0m3.208s 00:08:40.296 
user 0m6.297s 00:08:40.296 sys 0m0.315s 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:40.296 23:30:25 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:40.296 ************************************ 00:08:40.296 END TEST bdev_qd_sampling 00:08:40.296 ************************************ 00:08:40.296 23:30:25 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:08:40.296 23:30:25 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:40.296 23:30:25 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:40.296 23:30:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:40.296 ************************************ 00:08:40.296 START TEST bdev_error 00:08:40.296 ************************************ 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # error_test_suite '' 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=239240 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 239240' 00:08:40.296 Process error testing pid: 239240 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 239240 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 239240 ']' 00:08:40.296 23:30:25 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:08:40.296 23:30:25 
blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:40.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:40.296 23:30:25 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:40.554 [2024-07-24 23:30:25.324525] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:08:40.554 [2024-07-24 23:30:25.324564] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid239240 ] 00:08:40.554 [2024-07-24 23:30:25.387820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.554 [2024-07-24 23:30:25.465775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:08:41.486 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.486 Dev_1 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.486 23:30:26 
blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:41.486 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 [ 00:08:41.487 { 00:08:41.487 "name": "Dev_1", 00:08:41.487 "aliases": [ 00:08:41.487 "53efc143-f7c3-4ee0-8663-9c1993ce422e" 00:08:41.487 ], 00:08:41.487 "product_name": "Malloc disk", 00:08:41.487 "block_size": 512, 00:08:41.487 "num_blocks": 262144, 00:08:41.487 "uuid": "53efc143-f7c3-4ee0-8663-9c1993ce422e", 00:08:41.487 "assigned_rate_limits": { 00:08:41.487 "rw_ios_per_sec": 0, 00:08:41.487 "rw_mbytes_per_sec": 0, 00:08:41.487 "r_mbytes_per_sec": 0, 00:08:41.487 "w_mbytes_per_sec": 0 00:08:41.487 }, 00:08:41.487 "claimed": false, 00:08:41.487 "zoned": false, 00:08:41.487 "supported_io_types": { 00:08:41.487 "read": true, 00:08:41.487 
"write": true, 00:08:41.487 "unmap": true, 00:08:41.487 "flush": true, 00:08:41.487 "reset": true, 00:08:41.487 "nvme_admin": false, 00:08:41.487 "nvme_io": false, 00:08:41.487 "nvme_io_md": false, 00:08:41.487 "write_zeroes": true, 00:08:41.487 "zcopy": true, 00:08:41.487 "get_zone_info": false, 00:08:41.487 "zone_management": false, 00:08:41.487 "zone_append": false, 00:08:41.487 "compare": false, 00:08:41.487 "compare_and_write": false, 00:08:41.487 "abort": true, 00:08:41.487 "seek_hole": false, 00:08:41.487 "seek_data": false, 00:08:41.487 "copy": true, 00:08:41.487 "nvme_iov_md": false 00:08:41.487 }, 00:08:41.487 "memory_domains": [ 00:08:41.487 { 00:08:41.487 "dma_device_id": "system", 00:08:41.487 "dma_device_type": 1 00:08:41.487 }, 00:08:41.487 { 00:08:41.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:41.487 "dma_device_type": 2 00:08:41.487 } 00:08:41.487 ], 00:08:41.487 "driver_specific": {} 00:08:41.487 } 00:08:41.487 ] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 true 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 Dev_2 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 
-- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 [ 00:08:41.487 { 00:08:41.487 "name": "Dev_2", 00:08:41.487 "aliases": [ 00:08:41.487 "9ea4e5de-5726-4732-92c8-fcb3f473d280" 00:08:41.487 ], 00:08:41.487 "product_name": "Malloc disk", 00:08:41.487 "block_size": 512, 00:08:41.487 "num_blocks": 262144, 00:08:41.487 "uuid": "9ea4e5de-5726-4732-92c8-fcb3f473d280", 00:08:41.487 "assigned_rate_limits": { 00:08:41.487 "rw_ios_per_sec": 0, 00:08:41.487 "rw_mbytes_per_sec": 0, 00:08:41.487 "r_mbytes_per_sec": 0, 00:08:41.487 "w_mbytes_per_sec": 0 00:08:41.487 }, 00:08:41.487 "claimed": false, 00:08:41.487 "zoned": false, 00:08:41.487 "supported_io_types": { 
00:08:41.487 "read": true, 00:08:41.487 "write": true, 00:08:41.487 "unmap": true, 00:08:41.487 "flush": true, 00:08:41.487 "reset": true, 00:08:41.487 "nvme_admin": false, 00:08:41.487 "nvme_io": false, 00:08:41.487 "nvme_io_md": false, 00:08:41.487 "write_zeroes": true, 00:08:41.487 "zcopy": true, 00:08:41.487 "get_zone_info": false, 00:08:41.487 "zone_management": false, 00:08:41.487 "zone_append": false, 00:08:41.487 "compare": false, 00:08:41.487 "compare_and_write": false, 00:08:41.487 "abort": true, 00:08:41.487 "seek_hole": false, 00:08:41.487 "seek_data": false, 00:08:41.487 "copy": true, 00:08:41.487 "nvme_iov_md": false 00:08:41.487 }, 00:08:41.487 "memory_domains": [ 00:08:41.487 { 00:08:41.487 "dma_device_id": "system", 00:08:41.487 "dma_device_type": 1 00:08:41.487 }, 00:08:41.487 { 00:08:41.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:41.487 "dma_device_type": 2 00:08:41.487 } 00:08:41.487 ], 00:08:41.487 "driver_specific": {} 00:08:41.487 } 00:08:41.487 ] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:41.487 23:30:26 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:08:41.487 23:30:26 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:41.487 Running I/O for 5 seconds... 
00:08:42.420 23:30:27 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 239240 00:08:42.420 23:30:27 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 239240' 00:08:42.420 Process is existed as continue on error is set. Pid: 239240 00:08:42.420 23:30:27 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:42.420 23:30:27 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:42.420 23:30:27 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:42.420 23:30:27 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:08:42.420 Timeout while waiting for response: 00:08:42.420 00:08:42.420 00:08:46.599 00:08:46.599 Latency(us) 00:08:46.599 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:46.599 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:46.599 EE_Dev_1 : 0.93 58102.63 226.96 5.39 0.00 273.13 91.67 458.36 00:08:46.599 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:46.599 Dev_2 : 5.00 123147.32 481.04 0.00 0.00 127.67 42.18 18599.74 00:08:46.599 =================================================================================================================== 00:08:46.599 Total : 181249.95 708.01 5.39 0.00 139.38 42.18 18599.74 00:08:47.532 23:30:32 blockdev_general.bdev_error -- 
bdev/blockdev.sh@498 -- # killprocess 239240 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # '[' -z 239240 ']' 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # kill -0 239240 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # uname 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 239240 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # echo 'killing process with pid 239240' 00:08:47.532 killing process with pid 239240 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@969 -- # kill 239240 00:08:47.532 Received shutdown signal, test time was about 5.000000 seconds 00:08:47.532 00:08:47.532 Latency(us) 00:08:47.532 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:47.532 =================================================================================================================== 00:08:47.532 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@974 -- # wait 239240 00:08:47.532 23:30:32 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=240384 00:08:47.532 23:30:32 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 240384' 00:08:47.532 Process error testing pid: 240384 00:08:47.532 23:30:32 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 240384 00:08:47.532 23:30:32 blockdev_general.bdev_error -- 
common/autotest_common.sh@831 -- # '[' -z 240384 ']' 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:47.532 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:47.532 23:30:32 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:47.532 23:30:32 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:08:47.790 [2024-07-24 23:30:32.576542] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:47.790 [2024-07-24 23:30:32.576587] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid240384 ] 00:08:47.790 [2024-07-24 23:30:32.637077] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.790 [2024-07-24 23:30:32.708509] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:08:48.724 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.724 Dev_1 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.724 
23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.724 [ 00:08:48.724 { 00:08:48.724 "name": "Dev_1", 00:08:48.724 "aliases": [ 00:08:48.724 "7895ac4f-2987-44d5-935a-7ea3ad085091" 00:08:48.724 ], 00:08:48.724 "product_name": "Malloc disk", 00:08:48.724 "block_size": 512, 00:08:48.724 "num_blocks": 262144, 00:08:48.724 "uuid": "7895ac4f-2987-44d5-935a-7ea3ad085091", 00:08:48.724 "assigned_rate_limits": { 00:08:48.724 "rw_ios_per_sec": 0, 00:08:48.724 "rw_mbytes_per_sec": 0, 00:08:48.724 "r_mbytes_per_sec": 0, 00:08:48.724 "w_mbytes_per_sec": 0 00:08:48.724 }, 00:08:48.724 "claimed": false, 00:08:48.724 "zoned": false, 00:08:48.724 "supported_io_types": { 00:08:48.724 "read": true, 00:08:48.724 "write": true, 00:08:48.724 "unmap": true, 00:08:48.724 "flush": true, 00:08:48.724 "reset": true, 00:08:48.724 "nvme_admin": false, 00:08:48.724 "nvme_io": false, 00:08:48.724 "nvme_io_md": false, 00:08:48.724 "write_zeroes": true, 00:08:48.724 "zcopy": true, 00:08:48.724 "get_zone_info": false, 00:08:48.724 "zone_management": false, 00:08:48.724 "zone_append": false, 00:08:48.724 "compare": false, 00:08:48.724 "compare_and_write": false, 00:08:48.724 "abort": true, 00:08:48.724 "seek_hole": false, 00:08:48.724 "seek_data": false, 00:08:48.724 "copy": true, 00:08:48.724 "nvme_iov_md": false 00:08:48.724 }, 00:08:48.724 "memory_domains": [ 00:08:48.724 { 00:08:48.724 "dma_device_id": "system", 00:08:48.724 "dma_device_type": 1 00:08:48.724 }, 00:08:48.724 { 00:08:48.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:08:48.724 "dma_device_type": 2 00:08:48.724 } 00:08:48.724 ], 00:08:48.724 "driver_specific": {} 00:08:48.724 } 00:08:48.724 ] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:08:48.724 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.724 true 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.724 Dev_2 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:48.724 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:48.725 23:30:33 blockdev_general.bdev_error -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.725 [ 00:08:48.725 { 00:08:48.725 "name": "Dev_2", 00:08:48.725 "aliases": [ 00:08:48.725 "dd1343ac-6d7e-4e0a-8c47-0ecbe60a3d32" 00:08:48.725 ], 00:08:48.725 "product_name": "Malloc disk", 00:08:48.725 "block_size": 512, 00:08:48.725 "num_blocks": 262144, 00:08:48.725 "uuid": "dd1343ac-6d7e-4e0a-8c47-0ecbe60a3d32", 00:08:48.725 "assigned_rate_limits": { 00:08:48.725 "rw_ios_per_sec": 0, 00:08:48.725 "rw_mbytes_per_sec": 0, 00:08:48.725 "r_mbytes_per_sec": 0, 00:08:48.725 "w_mbytes_per_sec": 0 00:08:48.725 }, 00:08:48.725 "claimed": false, 00:08:48.725 "zoned": false, 00:08:48.725 "supported_io_types": { 00:08:48.725 "read": true, 00:08:48.725 "write": true, 00:08:48.725 "unmap": true, 00:08:48.725 "flush": true, 00:08:48.725 "reset": true, 00:08:48.725 "nvme_admin": false, 00:08:48.725 "nvme_io": false, 00:08:48.725 "nvme_io_md": false, 00:08:48.725 "write_zeroes": true, 00:08:48.725 "zcopy": true, 00:08:48.725 "get_zone_info": false, 00:08:48.725 "zone_management": false, 00:08:48.725 "zone_append": false, 00:08:48.725 "compare": false, 00:08:48.725 "compare_and_write": false, 00:08:48.725 "abort": true, 00:08:48.725 "seek_hole": false, 00:08:48.725 "seek_data": false, 00:08:48.725 "copy": true, 00:08:48.725 "nvme_iov_md": false 00:08:48.725 }, 00:08:48.725 "memory_domains": [ 00:08:48.725 { 00:08:48.725 "dma_device_id": "system", 00:08:48.725 "dma_device_type": 1 00:08:48.725 }, 00:08:48.725 { 
00:08:48.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:48.725 "dma_device_type": 2 00:08:48.725 } 00:08:48.725 ], 00:08:48.725 "driver_specific": {} 00:08:48.725 } 00:08:48.725 ] 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:08:48.725 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:48.725 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 240384 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # local es=0 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # valid_exec_arg wait 240384 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@638 -- # local arg=wait 00:08:48.725 23:30:33 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # type -t wait 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:48.725 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # wait 240384 00:08:48.725 Running I/O for 5 seconds... 
00:08:48.725 task offset: 72632 on job bdev=EE_Dev_1 fails 00:08:48.725 00:08:48.725 Latency(us) 00:08:48.725 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:48.725 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:48.725 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:08:48.725 EE_Dev_1 : 0.00 44534.41 173.96 10121.46 0.00 243.29 90.21 433.01 00:08:48.725 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:48.725 Dev_2 : 0.00 27373.82 106.93 0.00 0.00 434.76 86.80 807.50 00:08:48.725 =================================================================================================================== 00:08:48.725 Total : 71908.24 280.89 10121.46 0.00 347.14 86.80 807.50 00:08:48.725 [2024-07-24 23:30:33.567888] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:48.725 request: 00:08:48.725 { 00:08:48.725 "method": "perform_tests", 00:08:48.725 "req_id": 1 00:08:48.725 } 00:08:48.725 Got JSON-RPC error response 00:08:48.725 response: 00:08:48.725 { 00:08:48.725 "code": -32603, 00:08:48.725 "message": "bdevperf failed with error Operation not permitted" 00:08:48.725 } 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # es=255 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # es=127 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@663 -- # case "$es" in 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@670 -- # es=1 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:48.983 00:08:48.983 real 0m8.518s 00:08:48.983 user 0m8.794s 00:08:48.983 sys 0m0.587s 00:08:48.983 23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:48.983 
23:30:33 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:48.983 ************************************ 00:08:48.983 END TEST bdev_error 00:08:48.983 ************************************ 00:08:48.983 23:30:33 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:08:48.983 23:30:33 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:48.983 23:30:33 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:48.983 23:30:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:48.983 ************************************ 00:08:48.983 START TEST bdev_stat 00:08:48.983 ************************************ 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # stat_test_suite '' 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=240634 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 240634' 00:08:48.983 Process Bdev IO statistics testing pid: 240634 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 240634 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # '[' -z 240634 ']' 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # local max_retries=100 
00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:48.983 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:48.983 23:30:33 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:48.983 [2024-07-24 23:30:33.921605] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:08:48.983 [2024-07-24 23:30:33.921646] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid240634 ] 00:08:49.241 [2024-07-24 23:30:33.985050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:49.241 [2024-07-24 23:30:34.055500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:49.241 [2024-07-24 23:30:34.055503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@864 -- # return 0 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:49.807 Malloc_STAT 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:08:49.807 23:30:34 
blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_STAT 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # local i 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:49.807 [ 00:08:49.807 { 00:08:49.807 "name": "Malloc_STAT", 00:08:49.807 "aliases": [ 00:08:49.807 "9ed54802-b5ac-490d-b113-0fa7e9d5c047" 00:08:49.807 ], 00:08:49.807 "product_name": "Malloc disk", 00:08:49.807 "block_size": 512, 00:08:49.807 "num_blocks": 262144, 00:08:49.807 "uuid": "9ed54802-b5ac-490d-b113-0fa7e9d5c047", 00:08:49.807 "assigned_rate_limits": { 00:08:49.807 "rw_ios_per_sec": 0, 00:08:49.807 "rw_mbytes_per_sec": 0, 00:08:49.807 "r_mbytes_per_sec": 0, 00:08:49.807 "w_mbytes_per_sec": 0 00:08:49.807 }, 00:08:49.807 "claimed": false, 00:08:49.807 "zoned": false, 00:08:49.807 "supported_io_types": { 00:08:49.807 "read": true, 00:08:49.807 "write": true, 00:08:49.807 "unmap": true, 00:08:49.807 "flush": true, 00:08:49.807 "reset": 
true, 00:08:49.807 "nvme_admin": false, 00:08:49.807 "nvme_io": false, 00:08:49.807 "nvme_io_md": false, 00:08:49.807 "write_zeroes": true, 00:08:49.807 "zcopy": true, 00:08:49.807 "get_zone_info": false, 00:08:49.807 "zone_management": false, 00:08:49.807 "zone_append": false, 00:08:49.807 "compare": false, 00:08:49.807 "compare_and_write": false, 00:08:49.807 "abort": true, 00:08:49.807 "seek_hole": false, 00:08:49.807 "seek_data": false, 00:08:49.807 "copy": true, 00:08:49.807 "nvme_iov_md": false 00:08:49.807 }, 00:08:49.807 "memory_domains": [ 00:08:49.807 { 00:08:49.807 "dma_device_id": "system", 00:08:49.807 "dma_device_type": 1 00:08:49.807 }, 00:08:49.807 { 00:08:49.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:49.807 "dma_device_type": 2 00:08:49.807 } 00:08:49.807 ], 00:08:49.807 "driver_specific": {} 00:08:49.807 } 00:08:49.807 ] 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- common/autotest_common.sh@907 -- # return 0 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:08:49.807 23:30:34 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:50.065 Running I/O for 10 seconds... 
00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:08:51.964 "tick_rate": 2100000000, 00:08:51.964 "ticks": 11878035520657462, 00:08:51.964 "bdevs": [ 00:08:51.964 { 00:08:51.964 "name": "Malloc_STAT", 00:08:51.964 "bytes_read": 979415552, 00:08:51.964 "num_read_ops": 239108, 00:08:51.964 "bytes_written": 0, 00:08:51.964 "num_write_ops": 0, 00:08:51.964 "bytes_unmapped": 0, 00:08:51.964 "num_unmap_ops": 0, 00:08:51.964 "bytes_copied": 0, 00:08:51.964 "num_copy_ops": 0, 00:08:51.964 "read_latency_ticks": 2064419242012, 00:08:51.964 "max_read_latency_ticks": 10641538, 00:08:51.964 "min_read_latency_ticks": 180798, 
00:08:51.964 "write_latency_ticks": 0, 00:08:51.964 "max_write_latency_ticks": 0, 00:08:51.964 "min_write_latency_ticks": 0, 00:08:51.964 "unmap_latency_ticks": 0, 00:08:51.964 "max_unmap_latency_ticks": 0, 00:08:51.964 "min_unmap_latency_ticks": 0, 00:08:51.964 "copy_latency_ticks": 0, 00:08:51.964 "max_copy_latency_ticks": 0, 00:08:51.964 "min_copy_latency_ticks": 0, 00:08:51.964 "io_error": {} 00:08:51.964 } 00:08:51.964 ] 00:08:51.964 }' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=239108 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:08:51.964 "tick_rate": 2100000000, 00:08:51.964 "ticks": 11878035644262628, 00:08:51.964 "name": "Malloc_STAT", 00:08:51.964 "channels": [ 00:08:51.964 { 00:08:51.964 "thread_id": 2, 00:08:51.964 "bytes_read": 502267904, 00:08:51.964 "num_read_ops": 122624, 00:08:51.964 "bytes_written": 0, 00:08:51.964 "num_write_ops": 0, 00:08:51.964 "bytes_unmapped": 0, 00:08:51.964 "num_unmap_ops": 0, 00:08:51.964 "bytes_copied": 0, 00:08:51.964 "num_copy_ops": 0, 00:08:51.964 "read_latency_ticks": 1063416797860, 00:08:51.964 "max_read_latency_ticks": 9251424, 00:08:51.964 "min_read_latency_ticks": 5666586, 00:08:51.964 "write_latency_ticks": 0, 00:08:51.964 "max_write_latency_ticks": 0, 00:08:51.964 "min_write_latency_ticks": 0, 00:08:51.964 "unmap_latency_ticks": 0, 00:08:51.964 "max_unmap_latency_ticks": 0, 00:08:51.964 
"min_unmap_latency_ticks": 0, 00:08:51.964 "copy_latency_ticks": 0, 00:08:51.964 "max_copy_latency_ticks": 0, 00:08:51.964 "min_copy_latency_ticks": 0 00:08:51.964 }, 00:08:51.964 { 00:08:51.964 "thread_id": 3, 00:08:51.964 "bytes_read": 506462208, 00:08:51.964 "num_read_ops": 123648, 00:08:51.964 "bytes_written": 0, 00:08:51.964 "num_write_ops": 0, 00:08:51.964 "bytes_unmapped": 0, 00:08:51.964 "num_unmap_ops": 0, 00:08:51.964 "bytes_copied": 0, 00:08:51.964 "num_copy_ops": 0, 00:08:51.964 "read_latency_ticks": 1063777104194, 00:08:51.964 "max_read_latency_ticks": 10641538, 00:08:51.964 "min_read_latency_ticks": 5685286, 00:08:51.964 "write_latency_ticks": 0, 00:08:51.964 "max_write_latency_ticks": 0, 00:08:51.964 "min_write_latency_ticks": 0, 00:08:51.964 "unmap_latency_ticks": 0, 00:08:51.964 "max_unmap_latency_ticks": 0, 00:08:51.964 "min_unmap_latency_ticks": 0, 00:08:51.964 "copy_latency_ticks": 0, 00:08:51.964 "max_copy_latency_ticks": 0, 00:08:51.964 "min_copy_latency_ticks": 0 00:08:51.964 } 00:08:51.964 ] 00:08:51.964 }' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=122624 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=122624 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=123648 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=246272 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:08:51.964 "tick_rate": 2100000000, 00:08:51.964 "ticks": 11878035828773864, 00:08:51.964 "bdevs": [ 00:08:51.964 { 00:08:51.964 "name": "Malloc_STAT", 00:08:51.964 "bytes_read": 1053864448, 00:08:51.964 "num_read_ops": 257284, 00:08:51.964 "bytes_written": 0, 00:08:51.964 "num_write_ops": 0, 00:08:51.964 "bytes_unmapped": 0, 00:08:51.964 "num_unmap_ops": 0, 00:08:51.964 "bytes_copied": 0, 00:08:51.964 "num_copy_ops": 0, 00:08:51.964 "read_latency_ticks": 2223289563952, 00:08:51.964 "max_read_latency_ticks": 10641538, 00:08:51.964 "min_read_latency_ticks": 180798, 00:08:51.964 "write_latency_ticks": 0, 00:08:51.964 "max_write_latency_ticks": 0, 00:08:51.964 "min_write_latency_ticks": 0, 00:08:51.964 "unmap_latency_ticks": 0, 00:08:51.964 "max_unmap_latency_ticks": 0, 00:08:51.964 "min_unmap_latency_ticks": 0, 00:08:51.964 "copy_latency_ticks": 0, 00:08:51.964 "max_copy_latency_ticks": 0, 00:08:51.964 "min_copy_latency_ticks": 0, 00:08:51.964 "io_error": {} 00:08:51.964 } 00:08:51.964 ] 00:08:51.964 }' 00:08:51.964 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=257284 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 246272 -lt 239108 ']' 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 246272 -gt 257284 ']' 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:52.223 23:30:36 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:52.223 00:08:52.223 
Latency(us) 00:08:52.223 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:52.223 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:08:52.223 Malloc_STAT : 2.13 61842.57 241.57 0.00 0.00 4130.70 1076.66 4431.48 00:08:52.223 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:52.223 Malloc_STAT : 2.13 62404.25 243.77 0.00 0.00 4093.88 651.46 5086.84 00:08:52.223 =================================================================================================================== 00:08:52.223 Total : 124246.82 485.34 0.00 0.00 4112.20 651.46 5086.84 00:08:52.223 0 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 240634 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # '[' -z 240634 ']' 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # kill -0 240634 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # uname 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 240634 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 240634' 00:08:52.223 killing process with pid 240634 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@969 -- # kill 240634 00:08:52.223 Received shutdown signal, test time was about 2.199319 seconds 00:08:52.223 00:08:52.223 Latency(us) 
00:08:52.223 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:52.223 =================================================================================================================== 00:08:52.223 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:52.223 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@974 -- # wait 240634 00:08:52.482 23:30:37 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:08:52.482 00:08:52.482 real 0m3.359s 00:08:52.482 user 0m6.767s 00:08:52.482 sys 0m0.313s 00:08:52.482 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:52.482 23:30:37 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:52.482 ************************************ 00:08:52.482 END TEST bdev_stat 00:08:52.482 ************************************ 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:08:52.482 23:30:37 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:08:52.482 00:08:52.482 real 1m43.633s 00:08:52.482 user 7m2.505s 00:08:52.482 sys 0m14.597s 00:08:52.482 23:30:37 
blockdev_general -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:52.482 23:30:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:52.482 ************************************ 00:08:52.482 END TEST blockdev_general 00:08:52.482 ************************************ 00:08:52.482 23:30:37 -- spdk/autotest.sh@194 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:08:52.482 23:30:37 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:52.482 23:30:37 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:52.482 23:30:37 -- common/autotest_common.sh@10 -- # set +x 00:08:52.482 ************************************ 00:08:52.482 START TEST bdev_raid 00:08:52.482 ************************************ 00:08:52.482 23:30:37 bdev_raid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:08:52.482 * Looking for test storage... 00:08:52.482 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:52.482 23:30:37 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 
00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:08:52.482 23:30:37 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:08:52.482 23:30:37 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:52.482 23:30:37 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:52.482 23:30:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:52.482 ************************************ 00:08:52.482 START TEST raid_function_test_raid0 00:08:52.482 ************************************ 00:08:52.482 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # raid_function_test raid0 00:08:52.482 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=241399 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 241399' 00:08:52.483 Process raid pid: 241399 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 241399 /var/tmp/spdk-raid.sock 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # '[' -z 241399 ']' 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # local max_retries=100 
00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:52.483 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:52.483 23:30:37 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:08:52.741 [2024-07-24 23:30:37.516738] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:08:52.741 [2024-07-24 23:30:37.516793] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:52.741 [2024-07-24 23:30:37.581251] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.741 [2024-07-24 23:30:37.658048] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.741 [2024-07-24 23:30:37.706619] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:52.741 [2024-07-24 23:30:37.706653] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # return 0 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 
-- bdev/bdev_raid.sh@69 -- # cat 00:08:53.307 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:08:53.565 [2024-07-24 23:30:38.468925] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:53.565 [2024-07-24 23:30:38.469830] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:53.565 [2024-07-24 23:30:38.469872] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x222fc80 00:08:53.565 [2024-07-24 23:30:38.469878] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:53.565 [2024-07-24 23:30:38.469997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222fbc0 00:08:53.565 [2024-07-24 23:30:38.470076] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x222fc80 00:08:53.565 [2024-07-24 23:30:38.470081] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x222fc80 00:08:53.565 [2024-07-24 23:30:38.470150] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:53.565 Base_1 00:08:53.565 Base_2 00:08:53.565 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:53.566 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:08:53.566 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:53.824 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:08:53.824 [2024-07-24 23:30:38.805811] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222fbc0 00:08:53.824 /dev/nbd0 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # local i 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- 
common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@873 -- # break 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:54.083 1+0 records in 00:08:54.083 1+0 records out 00:08:54.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237857 s, 17.2 MB/s 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # size=4096 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@889 -- # return 0 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:54.083 23:30:38 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_get_disks 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:54.083 { 00:08:54.083 "nbd_device": "/dev/nbd0", 00:08:54.083 "bdev_name": "raid" 00:08:54.083 } 00:08:54.083 ]' 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:54.083 { 00:08:54.083 "nbd_device": "/dev/nbd0", 00:08:54.083 "bdev_name": "raid" 00:08:54.083 } 00:08:54.083 ]' 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:08:54.083 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:08:54.341 
23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:08:54.341 4096+0 records in 00:08:54.341 4096+0 records out 00:08:54.341 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0241478 s, 86.8 MB/s 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:08:54.341 4096+0 records in 00:08:54.341 4096+0 records out 00:08:54.341 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.145724 s, 14.4 MB/s 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest 
/dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:08:54.341 128+0 records in 00:08:54.341 128+0 records out 00:08:54.341 65536 bytes (66 kB, 64 KiB) copied, 0.000353112 s, 186 MB/s 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:08:54.341 2035+0 records in 00:08:54.341 2035+0 records out 00:08:54.341 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00341712 s, 305 MB/s 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:08:54.341 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 
-- # blockdev --flushbufs /dev/nbd0 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:08:54.342 456+0 records in 00:08:54.342 456+0 records out 00:08:54.342 233472 bytes (233 kB, 228 KiB) copied, 0.00114288 s, 204 MB/s 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:08:54.342 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:54.600 [2024-07-24 23:30:39.516924] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:54.600 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 
-- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 241399 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # '[' -z 241399 ']' 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # kill -0 241399 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # uname 00:08:54.859 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 241399 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 241399' 00:08:54.860 
killing process with pid 241399 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@969 -- # kill 241399 00:08:54.860 [2024-07-24 23:30:39.789092] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:54.860 [2024-07-24 23:30:39.789138] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:54.860 [2024-07-24 23:30:39.789165] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:54.860 [2024-07-24 23:30:39.789175] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x222fc80 name raid, state offline 00:08:54.860 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@974 -- # wait 241399 00:08:54.860 [2024-07-24 23:30:39.804383] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:55.118 23:30:39 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:08:55.118 00:08:55.118 real 0m2.512s 00:08:55.118 user 0m3.373s 00:08:55.118 sys 0m0.764s 00:08:55.118 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:55.118 23:30:39 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:08:55.118 ************************************ 00:08:55.118 END TEST raid_function_test_raid0 00:08:55.118 ************************************ 00:08:55.118 23:30:40 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:08:55.118 23:30:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:55.118 23:30:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:55.118 23:30:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:55.118 ************************************ 00:08:55.118 START TEST raid_function_test_concat 00:08:55.118 ************************************ 00:08:55.118 23:30:40 bdev_raid.raid_function_test_concat -- 
common/autotest_common.sh@1125 -- # raid_function_test concat 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=241855 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 241855' 00:08:55.119 Process raid pid: 241855 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 241855 /var/tmp/spdk-raid.sock 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # '[' -z 241855 ']' 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:55.119 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:55.119 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:08:55.119 [2024-07-24 23:30:40.097398] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:55.119 [2024-07-24 23:30:40.097444] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:55.377 [2024-07-24 23:30:40.161858] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.377 [2024-07-24 23:30:40.240408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.377 [2024-07-24 23:30:40.290937] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:55.377 [2024-07-24 23:30:40.290963] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # return 0 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:08:55.943 23:30:40 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:08:56.202 [2024-07-24 23:30:41.077567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:56.202 [2024-07-24 23:30:41.078436] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:56.202 [2024-07-24 23:30:41.078482] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1110c80 00:08:56.202 [2024-07-24 23:30:41.078488] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:56.202 [2024-07-24 23:30:41.078603] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1110bc0 00:08:56.202 [2024-07-24 23:30:41.078680] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1110c80 00:08:56.202 [2024-07-24 23:30:41.078685] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1110c80 00:08:56.202 [2024-07-24 23:30:41.078751] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:56.202 Base_1 00:08:56.202 Base_2 00:08:56.202 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:56.202 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:08:56.202 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:08:56.460 [2024-07-24 23:30:41.430486] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1110bc0 00:08:56.460 /dev/nbd0 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # local i 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@873 -- # break 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:08:56.460 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.719 1+0 records in 
00:08:56.719 1+0 records out 00:08:56.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00012825 s, 31.9 MB/s 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # size=4096 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@889 -- # return 0 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:56.719 { 00:08:56.719 "nbd_device": "/dev/nbd0", 00:08:56.719 "bdev_name": "raid" 00:08:56.719 } 00:08:56.719 ]' 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:56.719 { 00:08:56.719 "nbd_device": "/dev/nbd0", 00:08:56.719 "bdev_name": "raid" 00:08:56.719 } 00:08:56.719 ]' 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:56.719 23:30:41 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:08:56.719 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:08:56.720 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:08:56.978 4096+0 records in 00:08:56.978 4096+0 records out 00:08:56.978 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0249579 s, 84.0 MB/s 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:08:56.978 4096+0 records in 00:08:56.978 4096+0 records out 00:08:56.978 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.144857 s, 14.5 MB/s 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 
00:08:56.978 128+0 records in 00:08:56.978 128+0 records out 00:08:56.978 65536 bytes (66 kB, 64 KiB) copied, 0.000371936 s, 176 MB/s 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:08:56.978 2035+0 records in 00:08:56.978 2035+0 records out 00:08:56.978 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00510573 s, 204 MB/s 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:08:56.978 456+0 records in 00:08:56.978 456+0 records out 00:08:56.978 233472 bytes (233 kB, 228 KiB) copied, 0.00116532 s, 200 MB/s 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:56.978 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:08:56.979 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.979 23:30:41 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
00:08:57.268 [2024-07-24 23:30:42.123970] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:57.268 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 241855 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # '[' -z 241855 ']' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # kill -0 241855 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # uname 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 241855 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 241855' 00:08:57.531 killing process with pid 241855 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@969 -- # kill 241855 00:08:57.531 [2024-07-24 23:30:42.391193] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:57.531 [2024-07-24 23:30:42.391238] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:57.531 [2024-07-24 23:30:42.391269] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:57.531 
[2024-07-24 23:30:42.391275] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1110c80 name raid, state offline 00:08:57.531 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@974 -- # wait 241855 00:08:57.531 [2024-07-24 23:30:42.406662] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:57.790 23:30:42 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:08:57.790 00:08:57.790 real 0m2.528s 00:08:57.790 user 0m3.427s 00:08:57.790 sys 0m0.748s 00:08:57.790 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:57.790 23:30:42 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:08:57.790 ************************************ 00:08:57.790 END TEST raid_function_test_concat 00:08:57.790 ************************************ 00:08:57.790 23:30:42 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:08:57.790 23:30:42 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:57.790 23:30:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:57.790 23:30:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:57.790 ************************************ 00:08:57.790 START TEST raid0_resize_test 00:08:57.790 ************************************ 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # raid0_resize_test 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 
00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=242307 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 242307' 00:08:57.790 Process raid pid: 242307 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 242307 /var/tmp/spdk-raid.sock 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # '[' -z 242307 ']' 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:57.790 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:08:57.790 23:30:42 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:57.790 [2024-07-24 23:30:42.686443] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:08:57.790 [2024-07-24 23:30:42.686485] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:57.790 [2024-07-24 23:30:42.751453] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.049 [2024-07-24 23:30:42.829913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.049 [2024-07-24 23:30:42.886973] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:58.049 [2024-07-24 23:30:42.886998] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:58.614 23:30:43 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:58.614 23:30:43 bdev_raid.raid0_resize_test -- common/autotest_common.sh@864 -- # return 0 00:08:58.614 23:30:43 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:08:58.871 Base_1 00:08:58.871 23:30:43 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:08:58.871 Base_2 00:08:58.871 23:30:43 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:08:59.128 [2024-07-24 23:30:43.950424] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:59.128 [2024-07-24 23:30:43.951399] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:59.128 [2024-07-24 23:30:43.951433] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20bee30 00:08:59.128 [2024-07-24 23:30:43.951437] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:59.128 [2024-07-24 23:30:43.951583] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222b220 00:08:59.128 [2024-07-24 23:30:43.951644] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20bee30 00:08:59.128 [2024-07-24 23:30:43.951649] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x20bee30 00:08:59.128 [2024-07-24 23:30:43.951719] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:59.128 23:30:43 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:08:59.128 [2024-07-24 23:30:44.118844] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:08:59.128 [2024-07-24 23:30:44.118857] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:08:59.128 true 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:08:59.386 [2024-07-24 23:30:44.287369] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:08:59.386 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:08:59.643 
[2024-07-24 23:30:44.451691] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:08:59.643 [2024-07-24 23:30:44.451703] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:08:59.643 [2024-07-24 23:30:44.451718] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:08:59.643 true 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:08:59.643 [2024-07-24 23:30:44.620217] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 242307 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # '[' -z 242307 ']' 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # kill -0 242307 00:08:59.643 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # uname 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 242307 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@960 -- # '[' 
reactor_0 = sudo ']' 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 242307' 00:08:59.902 killing process with pid 242307 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@969 -- # kill 242307 00:08:59.902 [2024-07-24 23:30:44.682253] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:59.902 [2024-07-24 23:30:44.682295] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:59.902 [2024-07-24 23:30:44.682322] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:59.902 [2024-07-24 23:30:44.682327] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20bee30 name Raid, state offline 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@974 -- # wait 242307 00:08:59.902 [2024-07-24 23:30:44.683364] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:08:59.902 00:08:59.902 real 0m2.200s 00:08:59.902 user 0m3.323s 00:08:59.902 sys 0m0.414s 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:59.902 23:30:44 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:08:59.902 ************************************ 00:08:59.902 END TEST raid0_resize_test 00:08:59.902 ************************************ 00:08:59.902 23:30:44 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:08:59.902 23:30:44 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:08:59.902 23:30:44 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:08:59.902 23:30:44 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:08:59.902 23:30:44 bdev_raid -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:08:59.902 23:30:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:00.160 ************************************ 00:09:00.160 START TEST raid_state_function_test 00:09:00.160 ************************************ 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 false 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 
00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=242757 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 242757' 00:09:00.160 Process raid pid: 242757 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 242757 /var/tmp/spdk-raid.sock 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 242757 ']' 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk-raid.sock...' 00:09:00.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:00.160 23:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:00.160 [2024-07-24 23:30:44.965881] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:00.160 [2024-07-24 23:30:44.965920] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:00.160 [2024-07-24 23:30:45.031911] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.160 [2024-07-24 23:30:45.101493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.160 [2024-07-24 23:30:45.152673] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:00.160 [2024-07-24 23:30:45.152698] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:01.093 [2024-07-24 23:30:45.895934] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:01.093 [2024-07-24 23:30:45.895966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:01.093 [2024-07-24 23:30:45.895972] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: BaseBdev2 00:09:01.093 [2024-07-24 23:30:45.895977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:01.093 23:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:01.093 23:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:01.093 "name": "Existed_Raid", 00:09:01.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:01.093 "strip_size_kb": 64, 00:09:01.093 "state": "configuring", 00:09:01.093 "raid_level": "raid0", 00:09:01.093 
"superblock": false, 00:09:01.093 "num_base_bdevs": 2, 00:09:01.093 "num_base_bdevs_discovered": 0, 00:09:01.093 "num_base_bdevs_operational": 2, 00:09:01.093 "base_bdevs_list": [ 00:09:01.093 { 00:09:01.093 "name": "BaseBdev1", 00:09:01.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:01.093 "is_configured": false, 00:09:01.093 "data_offset": 0, 00:09:01.093 "data_size": 0 00:09:01.093 }, 00:09:01.093 { 00:09:01.093 "name": "BaseBdev2", 00:09:01.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:01.093 "is_configured": false, 00:09:01.093 "data_offset": 0, 00:09:01.093 "data_size": 0 00:09:01.093 } 00:09:01.093 ] 00:09:01.093 }' 00:09:01.093 23:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:01.093 23:30:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:01.657 23:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:01.915 [2024-07-24 23:30:46.734026] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:01.915 [2024-07-24 23:30:46.734044] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x928b10 name Existed_Raid, state configuring 00:09:01.915 23:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:01.915 [2024-07-24 23:30:46.914505] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:01.915 [2024-07-24 23:30:46.914524] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:01.915 [2024-07-24 23:30:46.914529] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:01.915 [2024-07-24 
23:30:46.914534] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:02.173 23:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:02.173 [2024-07-24 23:30:47.095257] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:02.173 BaseBdev1 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:02.173 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:02.431 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:02.690 [ 00:09:02.690 { 00:09:02.690 "name": "BaseBdev1", 00:09:02.690 "aliases": [ 00:09:02.690 "6f10584a-ca06-4cdd-8527-a2a82c155d08" 00:09:02.690 ], 00:09:02.690 "product_name": "Malloc disk", 00:09:02.690 "block_size": 512, 00:09:02.690 "num_blocks": 65536, 00:09:02.690 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 00:09:02.690 "assigned_rate_limits": { 00:09:02.690 "rw_ios_per_sec": 0, 00:09:02.690 
"rw_mbytes_per_sec": 0, 00:09:02.690 "r_mbytes_per_sec": 0, 00:09:02.690 "w_mbytes_per_sec": 0 00:09:02.690 }, 00:09:02.690 "claimed": true, 00:09:02.690 "claim_type": "exclusive_write", 00:09:02.690 "zoned": false, 00:09:02.690 "supported_io_types": { 00:09:02.690 "read": true, 00:09:02.690 "write": true, 00:09:02.690 "unmap": true, 00:09:02.690 "flush": true, 00:09:02.690 "reset": true, 00:09:02.690 "nvme_admin": false, 00:09:02.690 "nvme_io": false, 00:09:02.690 "nvme_io_md": false, 00:09:02.690 "write_zeroes": true, 00:09:02.690 "zcopy": true, 00:09:02.690 "get_zone_info": false, 00:09:02.690 "zone_management": false, 00:09:02.690 "zone_append": false, 00:09:02.690 "compare": false, 00:09:02.690 "compare_and_write": false, 00:09:02.690 "abort": true, 00:09:02.690 "seek_hole": false, 00:09:02.690 "seek_data": false, 00:09:02.690 "copy": true, 00:09:02.690 "nvme_iov_md": false 00:09:02.690 }, 00:09:02.690 "memory_domains": [ 00:09:02.690 { 00:09:02.690 "dma_device_id": "system", 00:09:02.690 "dma_device_type": 1 00:09:02.690 }, 00:09:02.690 { 00:09:02.690 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:02.690 "dma_device_type": 2 00:09:02.690 } 00:09:02.690 ], 00:09:02.690 "driver_specific": {} 00:09:02.690 } 00:09:02.690 ] 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:02.690 23:30:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:02.690 "name": "Existed_Raid", 00:09:02.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:02.690 "strip_size_kb": 64, 00:09:02.690 "state": "configuring", 00:09:02.690 "raid_level": "raid0", 00:09:02.690 "superblock": false, 00:09:02.690 "num_base_bdevs": 2, 00:09:02.690 "num_base_bdevs_discovered": 1, 00:09:02.690 "num_base_bdevs_operational": 2, 00:09:02.690 "base_bdevs_list": [ 00:09:02.690 { 00:09:02.690 "name": "BaseBdev1", 00:09:02.690 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 00:09:02.690 "is_configured": true, 00:09:02.690 "data_offset": 0, 00:09:02.690 "data_size": 65536 00:09:02.690 }, 00:09:02.690 { 00:09:02.690 "name": "BaseBdev2", 00:09:02.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:02.690 "is_configured": false, 00:09:02.690 "data_offset": 0, 00:09:02.690 "data_size": 0 00:09:02.690 } 00:09:02.690 ] 00:09:02.690 }' 00:09:02.690 23:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:02.690 23:30:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:03.255 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:03.255 [2024-07-24 23:30:48.250255] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:03.255 [2024-07-24 23:30:48.250292] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9283a0 name Existed_Raid, state configuring 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:03.513 [2024-07-24 23:30:48.422721] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:03.513 [2024-07-24 23:30:48.423723] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:03.513 [2024-07-24 23:30:48.423748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:03.513 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:03.771 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:03.771 "name": "Existed_Raid", 00:09:03.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:03.771 "strip_size_kb": 64, 00:09:03.771 "state": "configuring", 00:09:03.771 "raid_level": "raid0", 00:09:03.771 "superblock": false, 00:09:03.771 "num_base_bdevs": 2, 00:09:03.771 "num_base_bdevs_discovered": 1, 00:09:03.771 "num_base_bdevs_operational": 2, 00:09:03.771 "base_bdevs_list": [ 00:09:03.771 { 00:09:03.771 "name": "BaseBdev1", 00:09:03.771 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 00:09:03.771 "is_configured": true, 00:09:03.771 "data_offset": 0, 00:09:03.771 "data_size": 65536 00:09:03.771 }, 00:09:03.771 { 00:09:03.771 "name": "BaseBdev2", 00:09:03.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:03.771 "is_configured": false, 00:09:03.771 "data_offset": 0, 00:09:03.771 "data_size": 0 00:09:03.771 } 00:09:03.771 ] 00:09:03.771 }' 00:09:03.771 23:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:03.771 
23:30:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:04.337 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:04.337 [2024-07-24 23:30:49.255487] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:04.337 [2024-07-24 23:30:49.255515] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x929050 00:09:04.337 [2024-07-24 23:30:49.255520] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:04.337 [2024-07-24 23:30:49.255678] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x92e9d0 00:09:04.337 [2024-07-24 23:30:49.255757] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x929050 00:09:04.337 [2024-07-24 23:30:49.255762] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x929050 00:09:04.337 [2024-07-24 23:30:49.255877] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:04.337 BaseBdev2 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:04.338 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:04.595 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:04.854 [ 00:09:04.854 { 00:09:04.854 "name": "BaseBdev2", 00:09:04.854 "aliases": [ 00:09:04.854 "d39a094a-52eb-4f31-9073-c292d5b3aa06" 00:09:04.854 ], 00:09:04.854 "product_name": "Malloc disk", 00:09:04.854 "block_size": 512, 00:09:04.854 "num_blocks": 65536, 00:09:04.854 "uuid": "d39a094a-52eb-4f31-9073-c292d5b3aa06", 00:09:04.854 "assigned_rate_limits": { 00:09:04.854 "rw_ios_per_sec": 0, 00:09:04.854 "rw_mbytes_per_sec": 0, 00:09:04.854 "r_mbytes_per_sec": 0, 00:09:04.854 "w_mbytes_per_sec": 0 00:09:04.854 }, 00:09:04.854 "claimed": true, 00:09:04.854 "claim_type": "exclusive_write", 00:09:04.854 "zoned": false, 00:09:04.854 "supported_io_types": { 00:09:04.854 "read": true, 00:09:04.854 "write": true, 00:09:04.854 "unmap": true, 00:09:04.854 "flush": true, 00:09:04.854 "reset": true, 00:09:04.854 "nvme_admin": false, 00:09:04.854 "nvme_io": false, 00:09:04.854 "nvme_io_md": false, 00:09:04.854 "write_zeroes": true, 00:09:04.854 "zcopy": true, 00:09:04.854 "get_zone_info": false, 00:09:04.854 "zone_management": false, 00:09:04.854 "zone_append": false, 00:09:04.854 "compare": false, 00:09:04.854 "compare_and_write": false, 00:09:04.854 "abort": true, 00:09:04.854 "seek_hole": false, 00:09:04.854 "seek_data": false, 00:09:04.854 "copy": true, 00:09:04.854 "nvme_iov_md": false 00:09:04.854 }, 00:09:04.854 "memory_domains": [ 00:09:04.854 { 00:09:04.854 "dma_device_id": "system", 00:09:04.854 "dma_device_type": 1 00:09:04.854 }, 00:09:04.854 { 00:09:04.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:04.854 "dma_device_type": 2 00:09:04.854 } 00:09:04.854 ], 00:09:04.854 "driver_specific": {} 00:09:04.854 } 00:09:04.854 ] 
00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:04.854 "name": "Existed_Raid", 00:09:04.854 "uuid": "00caa1b4-693e-44e8-83d9-86f1c3a88509", 
00:09:04.854 "strip_size_kb": 64, 00:09:04.854 "state": "online", 00:09:04.854 "raid_level": "raid0", 00:09:04.854 "superblock": false, 00:09:04.854 "num_base_bdevs": 2, 00:09:04.854 "num_base_bdevs_discovered": 2, 00:09:04.854 "num_base_bdevs_operational": 2, 00:09:04.854 "base_bdevs_list": [ 00:09:04.854 { 00:09:04.854 "name": "BaseBdev1", 00:09:04.854 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 00:09:04.854 "is_configured": true, 00:09:04.854 "data_offset": 0, 00:09:04.854 "data_size": 65536 00:09:04.854 }, 00:09:04.854 { 00:09:04.854 "name": "BaseBdev2", 00:09:04.854 "uuid": "d39a094a-52eb-4f31-9073-c292d5b3aa06", 00:09:04.854 "is_configured": true, 00:09:04.854 "data_offset": 0, 00:09:04.854 "data_size": 65536 00:09:04.854 } 00:09:04.854 ] 00:09:04.854 }' 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:04.854 23:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:05.420 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:05.421 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:05.421 [2024-07-24 23:30:50.394648] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:05.421 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:05.421 "name": "Existed_Raid", 00:09:05.421 "aliases": [ 00:09:05.421 "00caa1b4-693e-44e8-83d9-86f1c3a88509" 00:09:05.421 ], 00:09:05.421 "product_name": "Raid Volume", 00:09:05.421 "block_size": 512, 00:09:05.421 "num_blocks": 131072, 00:09:05.421 "uuid": "00caa1b4-693e-44e8-83d9-86f1c3a88509", 00:09:05.421 "assigned_rate_limits": { 00:09:05.421 "rw_ios_per_sec": 0, 00:09:05.421 "rw_mbytes_per_sec": 0, 00:09:05.421 "r_mbytes_per_sec": 0, 00:09:05.421 "w_mbytes_per_sec": 0 00:09:05.421 }, 00:09:05.421 "claimed": false, 00:09:05.421 "zoned": false, 00:09:05.421 "supported_io_types": { 00:09:05.421 "read": true, 00:09:05.421 "write": true, 00:09:05.421 "unmap": true, 00:09:05.421 "flush": true, 00:09:05.421 "reset": true, 00:09:05.421 "nvme_admin": false, 00:09:05.421 "nvme_io": false, 00:09:05.421 "nvme_io_md": false, 00:09:05.421 "write_zeroes": true, 00:09:05.421 "zcopy": false, 00:09:05.421 "get_zone_info": false, 00:09:05.421 "zone_management": false, 00:09:05.421 "zone_append": false, 00:09:05.421 "compare": false, 00:09:05.421 "compare_and_write": false, 00:09:05.421 "abort": false, 00:09:05.421 "seek_hole": false, 00:09:05.421 "seek_data": false, 00:09:05.421 "copy": false, 00:09:05.421 "nvme_iov_md": false 00:09:05.421 }, 00:09:05.421 "memory_domains": [ 00:09:05.421 { 00:09:05.421 "dma_device_id": "system", 00:09:05.421 "dma_device_type": 1 00:09:05.421 }, 00:09:05.421 { 00:09:05.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.421 "dma_device_type": 2 00:09:05.421 }, 00:09:05.421 { 00:09:05.421 "dma_device_id": "system", 00:09:05.421 "dma_device_type": 1 00:09:05.421 }, 00:09:05.421 { 00:09:05.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.421 "dma_device_type": 2 00:09:05.421 } 00:09:05.421 ], 00:09:05.421 "driver_specific": { 00:09:05.421 "raid": { 
00:09:05.421 "uuid": "00caa1b4-693e-44e8-83d9-86f1c3a88509", 00:09:05.421 "strip_size_kb": 64, 00:09:05.421 "state": "online", 00:09:05.421 "raid_level": "raid0", 00:09:05.421 "superblock": false, 00:09:05.421 "num_base_bdevs": 2, 00:09:05.421 "num_base_bdevs_discovered": 2, 00:09:05.421 "num_base_bdevs_operational": 2, 00:09:05.421 "base_bdevs_list": [ 00:09:05.421 { 00:09:05.421 "name": "BaseBdev1", 00:09:05.421 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 00:09:05.421 "is_configured": true, 00:09:05.421 "data_offset": 0, 00:09:05.421 "data_size": 65536 00:09:05.421 }, 00:09:05.421 { 00:09:05.421 "name": "BaseBdev2", 00:09:05.421 "uuid": "d39a094a-52eb-4f31-9073-c292d5b3aa06", 00:09:05.421 "is_configured": true, 00:09:05.421 "data_offset": 0, 00:09:05.421 "data_size": 65536 00:09:05.421 } 00:09:05.421 ] 00:09:05.421 } 00:09:05.421 } 00:09:05.421 }' 00:09:05.421 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:05.680 BaseBdev2' 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:05.680 "name": "BaseBdev1", 00:09:05.680 "aliases": [ 00:09:05.680 "6f10584a-ca06-4cdd-8527-a2a82c155d08" 00:09:05.680 ], 00:09:05.680 "product_name": "Malloc disk", 00:09:05.680 "block_size": 512, 00:09:05.680 "num_blocks": 65536, 00:09:05.680 "uuid": "6f10584a-ca06-4cdd-8527-a2a82c155d08", 
00:09:05.680 "assigned_rate_limits": { 00:09:05.680 "rw_ios_per_sec": 0, 00:09:05.680 "rw_mbytes_per_sec": 0, 00:09:05.680 "r_mbytes_per_sec": 0, 00:09:05.680 "w_mbytes_per_sec": 0 00:09:05.680 }, 00:09:05.680 "claimed": true, 00:09:05.680 "claim_type": "exclusive_write", 00:09:05.680 "zoned": false, 00:09:05.680 "supported_io_types": { 00:09:05.680 "read": true, 00:09:05.680 "write": true, 00:09:05.680 "unmap": true, 00:09:05.680 "flush": true, 00:09:05.680 "reset": true, 00:09:05.680 "nvme_admin": false, 00:09:05.680 "nvme_io": false, 00:09:05.680 "nvme_io_md": false, 00:09:05.680 "write_zeroes": true, 00:09:05.680 "zcopy": true, 00:09:05.680 "get_zone_info": false, 00:09:05.680 "zone_management": false, 00:09:05.680 "zone_append": false, 00:09:05.680 "compare": false, 00:09:05.680 "compare_and_write": false, 00:09:05.680 "abort": true, 00:09:05.680 "seek_hole": false, 00:09:05.680 "seek_data": false, 00:09:05.680 "copy": true, 00:09:05.680 "nvme_iov_md": false 00:09:05.680 }, 00:09:05.680 "memory_domains": [ 00:09:05.680 { 00:09:05.680 "dma_device_id": "system", 00:09:05.680 "dma_device_type": 1 00:09:05.680 }, 00:09:05.680 { 00:09:05.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.680 "dma_device_type": 2 00:09:05.680 } 00:09:05.680 ], 00:09:05.680 "driver_specific": {} 00:09:05.680 }' 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:05.680 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:05.938 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:05.939 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:05.939 23:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:06.196 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:06.196 "name": "BaseBdev2", 00:09:06.196 "aliases": [ 00:09:06.196 "d39a094a-52eb-4f31-9073-c292d5b3aa06" 00:09:06.196 ], 00:09:06.196 "product_name": "Malloc disk", 00:09:06.196 "block_size": 512, 00:09:06.196 "num_blocks": 65536, 00:09:06.196 "uuid": "d39a094a-52eb-4f31-9073-c292d5b3aa06", 00:09:06.196 "assigned_rate_limits": { 00:09:06.196 "rw_ios_per_sec": 0, 00:09:06.196 "rw_mbytes_per_sec": 0, 00:09:06.196 "r_mbytes_per_sec": 0, 00:09:06.196 "w_mbytes_per_sec": 0 00:09:06.196 }, 00:09:06.196 "claimed": true, 00:09:06.196 "claim_type": "exclusive_write", 00:09:06.196 "zoned": false, 00:09:06.196 "supported_io_types": { 00:09:06.196 "read": true, 00:09:06.196 "write": true, 00:09:06.196 "unmap": true, 00:09:06.196 "flush": true, 00:09:06.196 "reset": true, 00:09:06.196 "nvme_admin": false, 00:09:06.196 "nvme_io": false, 00:09:06.196 "nvme_io_md": false, 00:09:06.196 "write_zeroes": true, 
00:09:06.196 "zcopy": true, 00:09:06.196 "get_zone_info": false, 00:09:06.196 "zone_management": false, 00:09:06.196 "zone_append": false, 00:09:06.196 "compare": false, 00:09:06.196 "compare_and_write": false, 00:09:06.196 "abort": true, 00:09:06.196 "seek_hole": false, 00:09:06.196 "seek_data": false, 00:09:06.196 "copy": true, 00:09:06.196 "nvme_iov_md": false 00:09:06.196 }, 00:09:06.196 "memory_domains": [ 00:09:06.196 { 00:09:06.196 "dma_device_id": "system", 00:09:06.196 "dma_device_type": 1 00:09:06.196 }, 00:09:06.196 { 00:09:06.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:06.196 "dma_device_type": 2 00:09:06.196 } 00:09:06.196 ], 00:09:06.196 "driver_specific": {} 00:09:06.196 }' 00:09:06.196 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:06.196 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:06.196 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:06.196 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:06.454 23:30:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:06.712 [2024-07-24 23:30:51.537444] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:06.712 [2024-07-24 23:30:51.537463] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:06.712 [2024-07-24 23:30:51.537510] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:06.712 23:30:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:06.712 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:06.969 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:06.969 "name": "Existed_Raid", 00:09:06.969 "uuid": "00caa1b4-693e-44e8-83d9-86f1c3a88509", 00:09:06.969 "strip_size_kb": 64, 00:09:06.969 "state": "offline", 00:09:06.969 "raid_level": "raid0", 00:09:06.969 "superblock": false, 00:09:06.969 "num_base_bdevs": 2, 00:09:06.969 "num_base_bdevs_discovered": 1, 00:09:06.969 "num_base_bdevs_operational": 1, 00:09:06.969 "base_bdevs_list": [ 00:09:06.969 { 00:09:06.969 "name": null, 00:09:06.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:06.969 "is_configured": false, 00:09:06.969 "data_offset": 0, 00:09:06.969 "data_size": 65536 00:09:06.969 }, 00:09:06.969 { 00:09:06.969 "name": "BaseBdev2", 00:09:06.969 "uuid": "d39a094a-52eb-4f31-9073-c292d5b3aa06", 00:09:06.969 "is_configured": true, 00:09:06.969 "data_offset": 0, 00:09:06.969 "data_size": 65536 00:09:06.969 } 00:09:06.969 ] 00:09:06.969 }' 00:09:06.969 23:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:06.969 23:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:07.227 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:07.227 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:07.227 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:07.227 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:07.485 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:07.485 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:07.485 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:07.743 [2024-07-24 23:30:52.512858] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:07.743 [2024-07-24 23:30:52.512898] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x929050 name Existed_Raid, state offline 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 242757 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 242757 ']' 00:09:07.743 23:30:52 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 242757 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:07.743 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 242757 00:09:08.002 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:08.002 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:08.002 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 242757' 00:09:08.002 killing process with pid 242757 00:09:08.002 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 242757 00:09:08.003 [2024-07-24 23:30:52.749339] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 242757 00:09:08.003 [2024-07-24 23:30:52.750118] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:08.003 00:09:08.003 real 0m8.015s 00:09:08.003 user 0m14.316s 00:09:08.003 sys 0m1.299s 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:08.003 ************************************ 00:09:08.003 END TEST raid_state_function_test 00:09:08.003 ************************************ 00:09:08.003 23:30:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:08.003 23:30:52 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:08.003 23:30:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:08.003 23:30:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:08.003 ************************************ 00:09:08.003 START TEST raid_state_function_test_sb 00:09:08.003 ************************************ 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 true 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=244352 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 244352' 00:09:08.003 Process raid pid: 244352 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 244352 /var/tmp/spdk-raid.sock 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 244352 ']' 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@836 -- # local max_retries=100 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:08.003 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:08.003 23:30:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:08.003 23:30:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:08.262 [2024-07-24 23:30:53.047953] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:08.262 [2024-07-24 23:30:53.047986] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:08.262 [2024-07-24 23:30:53.110376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.262 [2024-07-24 23:30:53.182590] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.262 [2024-07-24 23:30:53.232135] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:08.262 [2024-07-24 23:30:53.232159] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:09.196 [2024-07-24 23:30:53.982784] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: BaseBdev1 00:09:09.196 [2024-07-24 23:30:53.982813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:09.196 [2024-07-24 23:30:53.982819] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:09.196 [2024-07-24 23:30:53.982824] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:09.196 23:30:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:09.196 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:09.196 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:09.196 23:30:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:09.196 "name": "Existed_Raid", 00:09:09.196 "uuid": "fa0bcbe1-1c33-4cb6-ad38-331a2ef7cfd9", 00:09:09.196 "strip_size_kb": 64, 00:09:09.196 "state": "configuring", 00:09:09.196 "raid_level": "raid0", 00:09:09.196 "superblock": true, 00:09:09.196 "num_base_bdevs": 2, 00:09:09.196 "num_base_bdevs_discovered": 0, 00:09:09.196 "num_base_bdevs_operational": 2, 00:09:09.196 "base_bdevs_list": [ 00:09:09.196 { 00:09:09.196 "name": "BaseBdev1", 00:09:09.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:09.196 "is_configured": false, 00:09:09.196 "data_offset": 0, 00:09:09.196 "data_size": 0 00:09:09.196 }, 00:09:09.196 { 00:09:09.196 "name": "BaseBdev2", 00:09:09.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:09.196 "is_configured": false, 00:09:09.196 "data_offset": 0, 00:09:09.196 "data_size": 0 00:09:09.196 } 00:09:09.196 ] 00:09:09.196 }' 00:09:09.196 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:09.196 23:30:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:09.763 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:10.021 [2024-07-24 23:30:54.768755] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:10.021 [2024-07-24 23:30:54.768778] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d7b10 name Existed_Raid, state configuring 00:09:10.021 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:10.021 [2024-07-24 23:30:54.949226] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev1 00:09:10.021 [2024-07-24 23:30:54.949247] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:10.021 [2024-07-24 23:30:54.949252] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:10.021 [2024-07-24 23:30:54.949257] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:10.021 23:30:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:10.279 [2024-07-24 23:30:55.137965] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:10.279 BaseBdev1 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:10.279 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:10.537 [ 00:09:10.537 { 00:09:10.537 "name": "BaseBdev1", 00:09:10.537 "aliases": 
[ 00:09:10.537 "4df30685-5a5d-45b6-ad12-1e535dbcb4c9" 00:09:10.537 ], 00:09:10.537 "product_name": "Malloc disk", 00:09:10.537 "block_size": 512, 00:09:10.537 "num_blocks": 65536, 00:09:10.537 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:10.537 "assigned_rate_limits": { 00:09:10.537 "rw_ios_per_sec": 0, 00:09:10.537 "rw_mbytes_per_sec": 0, 00:09:10.537 "r_mbytes_per_sec": 0, 00:09:10.537 "w_mbytes_per_sec": 0 00:09:10.537 }, 00:09:10.537 "claimed": true, 00:09:10.537 "claim_type": "exclusive_write", 00:09:10.537 "zoned": false, 00:09:10.537 "supported_io_types": { 00:09:10.537 "read": true, 00:09:10.537 "write": true, 00:09:10.537 "unmap": true, 00:09:10.537 "flush": true, 00:09:10.537 "reset": true, 00:09:10.537 "nvme_admin": false, 00:09:10.537 "nvme_io": false, 00:09:10.537 "nvme_io_md": false, 00:09:10.537 "write_zeroes": true, 00:09:10.537 "zcopy": true, 00:09:10.537 "get_zone_info": false, 00:09:10.537 "zone_management": false, 00:09:10.537 "zone_append": false, 00:09:10.537 "compare": false, 00:09:10.537 "compare_and_write": false, 00:09:10.537 "abort": true, 00:09:10.537 "seek_hole": false, 00:09:10.537 "seek_data": false, 00:09:10.537 "copy": true, 00:09:10.537 "nvme_iov_md": false 00:09:10.537 }, 00:09:10.537 "memory_domains": [ 00:09:10.537 { 00:09:10.537 "dma_device_id": "system", 00:09:10.537 "dma_device_type": 1 00:09:10.537 }, 00:09:10.537 { 00:09:10.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:10.537 "dma_device_type": 2 00:09:10.537 } 00:09:10.537 ], 00:09:10.537 "driver_specific": {} 00:09:10.537 } 00:09:10.537 ] 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:10.537 23:30:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:10.537 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:10.795 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:10.795 "name": "Existed_Raid", 00:09:10.795 "uuid": "9b0b3298-eeb9-4472-bd1c-10db05a639cf", 00:09:10.795 "strip_size_kb": 64, 00:09:10.795 "state": "configuring", 00:09:10.795 "raid_level": "raid0", 00:09:10.795 "superblock": true, 00:09:10.795 "num_base_bdevs": 2, 00:09:10.795 "num_base_bdevs_discovered": 1, 00:09:10.795 "num_base_bdevs_operational": 2, 00:09:10.795 "base_bdevs_list": [ 00:09:10.795 { 00:09:10.795 "name": "BaseBdev1", 00:09:10.795 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:10.795 "is_configured": true, 00:09:10.795 "data_offset": 2048, 00:09:10.795 "data_size": 63488 00:09:10.795 }, 00:09:10.795 { 00:09:10.795 
"name": "BaseBdev2", 00:09:10.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:10.795 "is_configured": false, 00:09:10.795 "data_offset": 0, 00:09:10.795 "data_size": 0 00:09:10.795 } 00:09:10.795 ] 00:09:10.795 }' 00:09:10.795 23:30:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:10.795 23:30:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:11.361 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:11.361 [2024-07-24 23:30:56.308980] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:11.361 [2024-07-24 23:30:56.309011] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d73a0 name Existed_Raid, state configuring 00:09:11.361 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:11.620 [2024-07-24 23:30:56.493480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:11.620 [2024-07-24 23:30:56.494458] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:11.620 [2024-07-24 23:30:56.494491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:11.620 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:11.878 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:11.878 "name": "Existed_Raid", 00:09:11.878 "uuid": "55ce2a0a-80b0-4345-b546-75485cecd25e", 00:09:11.878 "strip_size_kb": 64, 00:09:11.878 "state": "configuring", 00:09:11.878 "raid_level": "raid0", 00:09:11.878 "superblock": true, 00:09:11.878 "num_base_bdevs": 2, 00:09:11.878 "num_base_bdevs_discovered": 1, 00:09:11.878 "num_base_bdevs_operational": 2, 00:09:11.878 "base_bdevs_list": [ 00:09:11.878 { 00:09:11.878 "name": "BaseBdev1", 00:09:11.878 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:11.878 "is_configured": true, 00:09:11.878 "data_offset": 
2048, 00:09:11.878 "data_size": 63488 00:09:11.878 }, 00:09:11.878 { 00:09:11.878 "name": "BaseBdev2", 00:09:11.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:11.878 "is_configured": false, 00:09:11.878 "data_offset": 0, 00:09:11.878 "data_size": 0 00:09:11.878 } 00:09:11.878 ] 00:09:11.878 }' 00:09:11.878 23:30:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:11.878 23:30:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:12.444 [2024-07-24 23:30:57.322158] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:12.444 [2024-07-24 23:30:57.322275] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x7d8050 00:09:12.444 [2024-07-24 23:30:57.322283] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:12.444 [2024-07-24 23:30:57.322394] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7dbeb0 00:09:12.444 [2024-07-24 23:30:57.322475] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7d8050 00:09:12.444 [2024-07-24 23:30:57.322480] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7d8050 00:09:12.444 [2024-07-24 23:30:57.322558] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:12.444 BaseBdev2 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 
00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:12.444 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:12.703 [ 00:09:12.703 { 00:09:12.703 "name": "BaseBdev2", 00:09:12.703 "aliases": [ 00:09:12.703 "1abd99b5-5365-452c-9c82-c5d431fa0429" 00:09:12.703 ], 00:09:12.703 "product_name": "Malloc disk", 00:09:12.703 "block_size": 512, 00:09:12.703 "num_blocks": 65536, 00:09:12.703 "uuid": "1abd99b5-5365-452c-9c82-c5d431fa0429", 00:09:12.703 "assigned_rate_limits": { 00:09:12.703 "rw_ios_per_sec": 0, 00:09:12.703 "rw_mbytes_per_sec": 0, 00:09:12.703 "r_mbytes_per_sec": 0, 00:09:12.703 "w_mbytes_per_sec": 0 00:09:12.703 }, 00:09:12.703 "claimed": true, 00:09:12.703 "claim_type": "exclusive_write", 00:09:12.703 "zoned": false, 00:09:12.703 "supported_io_types": { 00:09:12.703 "read": true, 00:09:12.703 "write": true, 00:09:12.703 "unmap": true, 00:09:12.703 "flush": true, 00:09:12.703 "reset": true, 00:09:12.703 "nvme_admin": false, 00:09:12.703 "nvme_io": false, 00:09:12.703 "nvme_io_md": false, 00:09:12.703 "write_zeroes": true, 00:09:12.703 "zcopy": true, 00:09:12.703 "get_zone_info": false, 00:09:12.703 "zone_management": false, 00:09:12.703 "zone_append": false, 00:09:12.703 "compare": false, 00:09:12.703 "compare_and_write": false, 00:09:12.703 "abort": true, 00:09:12.703 "seek_hole": false, 00:09:12.703 
"seek_data": false, 00:09:12.703 "copy": true, 00:09:12.703 "nvme_iov_md": false 00:09:12.703 }, 00:09:12.703 "memory_domains": [ 00:09:12.703 { 00:09:12.703 "dma_device_id": "system", 00:09:12.703 "dma_device_type": 1 00:09:12.703 }, 00:09:12.703 { 00:09:12.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:12.703 "dma_device_type": 2 00:09:12.703 } 00:09:12.703 ], 00:09:12.703 "driver_specific": {} 00:09:12.703 } 00:09:12.703 ] 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:12.703 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:12.961 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:12.961 "name": "Existed_Raid", 00:09:12.961 "uuid": "55ce2a0a-80b0-4345-b546-75485cecd25e", 00:09:12.961 "strip_size_kb": 64, 00:09:12.961 "state": "online", 00:09:12.961 "raid_level": "raid0", 00:09:12.961 "superblock": true, 00:09:12.961 "num_base_bdevs": 2, 00:09:12.961 "num_base_bdevs_discovered": 2, 00:09:12.961 "num_base_bdevs_operational": 2, 00:09:12.961 "base_bdevs_list": [ 00:09:12.961 { 00:09:12.961 "name": "BaseBdev1", 00:09:12.961 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:12.961 "is_configured": true, 00:09:12.961 "data_offset": 2048, 00:09:12.961 "data_size": 63488 00:09:12.961 }, 00:09:12.961 { 00:09:12.961 "name": "BaseBdev2", 00:09:12.961 "uuid": "1abd99b5-5365-452c-9c82-c5d431fa0429", 00:09:12.961 "is_configured": true, 00:09:12.961 "data_offset": 2048, 00:09:12.961 "data_size": 63488 00:09:12.961 } 00:09:12.961 ] 00:09:12.961 }' 00:09:12.961 23:30:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:12.961 23:30:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:13.526 23:30:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:13.526 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:13.526 [2024-07-24 23:30:58.517443] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:13.784 "name": "Existed_Raid", 00:09:13.784 "aliases": [ 00:09:13.784 "55ce2a0a-80b0-4345-b546-75485cecd25e" 00:09:13.784 ], 00:09:13.784 "product_name": "Raid Volume", 00:09:13.784 "block_size": 512, 00:09:13.784 "num_blocks": 126976, 00:09:13.784 "uuid": "55ce2a0a-80b0-4345-b546-75485cecd25e", 00:09:13.784 "assigned_rate_limits": { 00:09:13.784 "rw_ios_per_sec": 0, 00:09:13.784 "rw_mbytes_per_sec": 0, 00:09:13.784 "r_mbytes_per_sec": 0, 00:09:13.784 "w_mbytes_per_sec": 0 00:09:13.784 }, 00:09:13.784 "claimed": false, 00:09:13.784 "zoned": false, 00:09:13.784 "supported_io_types": { 00:09:13.784 "read": true, 00:09:13.784 "write": true, 00:09:13.784 "unmap": true, 00:09:13.784 "flush": true, 00:09:13.784 "reset": true, 00:09:13.784 "nvme_admin": false, 00:09:13.784 "nvme_io": false, 00:09:13.784 "nvme_io_md": false, 00:09:13.784 "write_zeroes": true, 00:09:13.784 "zcopy": false, 00:09:13.784 "get_zone_info": false, 00:09:13.784 "zone_management": false, 00:09:13.784 "zone_append": false, 00:09:13.784 "compare": false, 00:09:13.784 "compare_and_write": false, 00:09:13.784 "abort": false, 00:09:13.784 "seek_hole": false, 00:09:13.784 "seek_data": false, 00:09:13.784 "copy": false, 00:09:13.784 "nvme_iov_md": false 00:09:13.784 }, 00:09:13.784 
"memory_domains": [ 00:09:13.784 { 00:09:13.784 "dma_device_id": "system", 00:09:13.784 "dma_device_type": 1 00:09:13.784 }, 00:09:13.784 { 00:09:13.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:13.784 "dma_device_type": 2 00:09:13.784 }, 00:09:13.784 { 00:09:13.784 "dma_device_id": "system", 00:09:13.784 "dma_device_type": 1 00:09:13.784 }, 00:09:13.784 { 00:09:13.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:13.784 "dma_device_type": 2 00:09:13.784 } 00:09:13.784 ], 00:09:13.784 "driver_specific": { 00:09:13.784 "raid": { 00:09:13.784 "uuid": "55ce2a0a-80b0-4345-b546-75485cecd25e", 00:09:13.784 "strip_size_kb": 64, 00:09:13.784 "state": "online", 00:09:13.784 "raid_level": "raid0", 00:09:13.784 "superblock": true, 00:09:13.784 "num_base_bdevs": 2, 00:09:13.784 "num_base_bdevs_discovered": 2, 00:09:13.784 "num_base_bdevs_operational": 2, 00:09:13.784 "base_bdevs_list": [ 00:09:13.784 { 00:09:13.784 "name": "BaseBdev1", 00:09:13.784 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:13.784 "is_configured": true, 00:09:13.784 "data_offset": 2048, 00:09:13.784 "data_size": 63488 00:09:13.784 }, 00:09:13.784 { 00:09:13.784 "name": "BaseBdev2", 00:09:13.784 "uuid": "1abd99b5-5365-452c-9c82-c5d431fa0429", 00:09:13.784 "is_configured": true, 00:09:13.784 "data_offset": 2048, 00:09:13.784 "data_size": 63488 00:09:13.784 } 00:09:13.784 ] 00:09:13.784 } 00:09:13.784 } 00:09:13.784 }' 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:13.784 BaseBdev2' 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:13.784 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:13.784 "name": "BaseBdev1", 00:09:13.784 "aliases": [ 00:09:13.784 "4df30685-5a5d-45b6-ad12-1e535dbcb4c9" 00:09:13.784 ], 00:09:13.784 "product_name": "Malloc disk", 00:09:13.784 "block_size": 512, 00:09:13.784 "num_blocks": 65536, 00:09:13.784 "uuid": "4df30685-5a5d-45b6-ad12-1e535dbcb4c9", 00:09:13.784 "assigned_rate_limits": { 00:09:13.784 "rw_ios_per_sec": 0, 00:09:13.784 "rw_mbytes_per_sec": 0, 00:09:13.784 "r_mbytes_per_sec": 0, 00:09:13.784 "w_mbytes_per_sec": 0 00:09:13.784 }, 00:09:13.784 "claimed": true, 00:09:13.784 "claim_type": "exclusive_write", 00:09:13.784 "zoned": false, 00:09:13.784 "supported_io_types": { 00:09:13.784 "read": true, 00:09:13.784 "write": true, 00:09:13.784 "unmap": true, 00:09:13.784 "flush": true, 00:09:13.784 "reset": true, 00:09:13.784 "nvme_admin": false, 00:09:13.784 "nvme_io": false, 00:09:13.784 "nvme_io_md": false, 00:09:13.784 "write_zeroes": true, 00:09:13.784 "zcopy": true, 00:09:13.784 "get_zone_info": false, 00:09:13.784 "zone_management": false, 00:09:13.784 "zone_append": false, 00:09:13.784 "compare": false, 00:09:13.784 "compare_and_write": false, 00:09:13.784 "abort": true, 00:09:13.784 "seek_hole": false, 00:09:13.784 "seek_data": false, 00:09:13.784 "copy": true, 00:09:13.784 "nvme_iov_md": false 00:09:13.784 }, 00:09:13.784 "memory_domains": [ 00:09:13.784 { 00:09:13.784 "dma_device_id": "system", 00:09:13.784 "dma_device_type": 1 00:09:13.784 }, 00:09:13.784 { 00:09:13.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:13.784 "dma_device_type": 2 00:09:13.784 } 00:09:13.784 ], 00:09:13.784 "driver_specific": {} 00:09:13.784 }' 00:09:13.784 23:30:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:14.042 23:30:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:14.042 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:14.300 "name": "BaseBdev2", 00:09:14.300 "aliases": [ 00:09:14.300 "1abd99b5-5365-452c-9c82-c5d431fa0429" 00:09:14.300 ], 00:09:14.300 "product_name": "Malloc disk", 00:09:14.300 "block_size": 512, 00:09:14.300 
"num_blocks": 65536, 00:09:14.300 "uuid": "1abd99b5-5365-452c-9c82-c5d431fa0429", 00:09:14.300 "assigned_rate_limits": { 00:09:14.300 "rw_ios_per_sec": 0, 00:09:14.300 "rw_mbytes_per_sec": 0, 00:09:14.300 "r_mbytes_per_sec": 0, 00:09:14.300 "w_mbytes_per_sec": 0 00:09:14.300 }, 00:09:14.300 "claimed": true, 00:09:14.300 "claim_type": "exclusive_write", 00:09:14.300 "zoned": false, 00:09:14.300 "supported_io_types": { 00:09:14.300 "read": true, 00:09:14.300 "write": true, 00:09:14.300 "unmap": true, 00:09:14.300 "flush": true, 00:09:14.300 "reset": true, 00:09:14.300 "nvme_admin": false, 00:09:14.300 "nvme_io": false, 00:09:14.300 "nvme_io_md": false, 00:09:14.300 "write_zeroes": true, 00:09:14.300 "zcopy": true, 00:09:14.300 "get_zone_info": false, 00:09:14.300 "zone_management": false, 00:09:14.300 "zone_append": false, 00:09:14.300 "compare": false, 00:09:14.300 "compare_and_write": false, 00:09:14.300 "abort": true, 00:09:14.300 "seek_hole": false, 00:09:14.300 "seek_data": false, 00:09:14.300 "copy": true, 00:09:14.300 "nvme_iov_md": false 00:09:14.300 }, 00:09:14.300 "memory_domains": [ 00:09:14.300 { 00:09:14.300 "dma_device_id": "system", 00:09:14.300 "dma_device_type": 1 00:09:14.300 }, 00:09:14.300 { 00:09:14.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:14.300 "dma_device_type": 2 00:09:14.300 } 00:09:14.300 ], 00:09:14.300 "driver_specific": {} 00:09:14.300 }' 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:14.300 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:14.557 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:14.815 [2024-07-24 23:30:59.652234] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:14.815 [2024-07-24 23:30:59.652258] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:14.815 [2024-07-24 23:30:59.652285] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:14.815 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:15.073 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:15.073 "name": "Existed_Raid", 00:09:15.073 "uuid": "55ce2a0a-80b0-4345-b546-75485cecd25e", 00:09:15.073 "strip_size_kb": 64, 00:09:15.073 "state": "offline", 00:09:15.073 "raid_level": "raid0", 00:09:15.073 "superblock": true, 00:09:15.073 "num_base_bdevs": 2, 00:09:15.073 "num_base_bdevs_discovered": 1, 00:09:15.073 "num_base_bdevs_operational": 1, 00:09:15.073 "base_bdevs_list": [ 00:09:15.073 { 00:09:15.073 "name": null, 00:09:15.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:15.073 "is_configured": false, 00:09:15.073 "data_offset": 2048, 00:09:15.073 
"data_size": 63488 00:09:15.073 }, 00:09:15.073 { 00:09:15.073 "name": "BaseBdev2", 00:09:15.073 "uuid": "1abd99b5-5365-452c-9c82-c5d431fa0429", 00:09:15.073 "is_configured": true, 00:09:15.073 "data_offset": 2048, 00:09:15.073 "data_size": 63488 00:09:15.073 } 00:09:15.073 ] 00:09:15.073 }' 00:09:15.073 23:30:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:15.073 23:30:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:15.331 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:15.331 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:15.331 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:15.331 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:15.589 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:15.589 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:15.589 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:15.847 [2024-07-24 23:31:00.651687] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:15.847 [2024-07-24 23:31:00.651726] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7d8050 name Existed_Raid, state offline 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:15.847 23:31:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 244352 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 244352 ']' 00:09:15.847 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 244352 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 244352 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 244352' 00:09:16.116 killing process with pid 244352 00:09:16.116 23:31:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 244352 00:09:16.116 [2024-07-24 23:31:00.887912] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:16.116 23:31:00 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 244352 00:09:16.116 [2024-07-24 23:31:00.888694] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:16.116 23:31:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:16.116 00:09:16.116 real 0m8.067s 00:09:16.116 user 0m14.462s 00:09:16.116 sys 0m1.330s 00:09:16.116 23:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:16.116 23:31:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:16.116 ************************************ 00:09:16.116 END TEST raid_state_function_test_sb 00:09:16.116 ************************************ 00:09:16.116 23:31:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:16.116 23:31:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:16.116 23:31:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:16.116 23:31:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:16.401 ************************************ 00:09:16.401 START TEST raid_superblock_test 00:09:16.401 ************************************ 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 2 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local 
base_bdevs_pt 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=245947 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 245947 /var/tmp/spdk-raid.sock 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 245947 ']' 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk-raid.sock...' 00:09:16.401 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:16.401 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:16.402 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:16.402 [2024-07-24 23:31:01.173710] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:16.402 [2024-07-24 23:31:01.173749] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid245947 ] 00:09:16.402 [2024-07-24 23:31:01.237833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.402 [2024-07-24 23:31:01.308819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.402 [2024-07-24 23:31:01.360640] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:16.402 [2024-07-24 23:31:01.360670] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:16.981 23:31:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:16.981 23:31:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:17.239 malloc1 00:09:17.239 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:17.497 [2024-07-24 23:31:02.300229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:17.497 [2024-07-24 23:31:02.300267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:17.497 [2024-07-24 23:31:02.300280] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9ddd0 00:09:17.497 [2024-07-24 23:31:02.300302] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:17.497 [2024-07-24 23:31:02.301296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:17.497 [2024-07-24 23:31:02.301315] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:17.497 pt1 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:17.497 23:31:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:17.497 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:17.497 malloc2 00:09:17.755 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:17.755 [2024-07-24 23:31:02.636523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:17.755 [2024-07-24 23:31:02.636549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:17.755 [2024-07-24 23:31:02.636560] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9e8d0 00:09:17.755 [2024-07-24 23:31:02.636565] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:17.755 [2024-07-24 23:31:02.637475] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:17.755 [2024-07-24 23:31:02.637492] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:17.755 pt2 00:09:17.755 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:17.755 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:17.755 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:18.013 [2024-07-24 23:31:02.804958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:18.013 [2024-07-24 23:31:02.805745] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:18.013 [2024-07-24 23:31:02.805838] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xda12d0 00:09:18.013 [2024-07-24 23:31:02.805846] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:18.013 [2024-07-24 23:31:02.805960] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xda0730 00:09:18.013 [2024-07-24 23:31:02.806052] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xda12d0 00:09:18.013 [2024-07-24 23:31:02.806058] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xda12d0 00:09:18.013 [2024-07-24 23:31:02.806115] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:18.013 "name": "raid_bdev1", 00:09:18.013 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:18.013 "strip_size_kb": 64, 00:09:18.013 "state": "online", 00:09:18.013 "raid_level": "raid0", 00:09:18.013 "superblock": true, 00:09:18.013 "num_base_bdevs": 2, 00:09:18.013 "num_base_bdevs_discovered": 2, 00:09:18.013 "num_base_bdevs_operational": 2, 00:09:18.013 "base_bdevs_list": [ 00:09:18.013 { 00:09:18.013 "name": "pt1", 00:09:18.013 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:18.013 "is_configured": true, 00:09:18.013 "data_offset": 2048, 00:09:18.013 "data_size": 63488 00:09:18.013 }, 00:09:18.013 { 00:09:18.013 "name": "pt2", 00:09:18.013 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:18.013 "is_configured": true, 00:09:18.013 "data_offset": 2048, 00:09:18.013 "data_size": 63488 00:09:18.013 } 00:09:18.013 ] 00:09:18.013 }' 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:18.013 23:31:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:18.588 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:18.589 23:31:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:18.589 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:18.852 [2024-07-24 23:31:03.639274] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:18.852 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:18.852 "name": "raid_bdev1", 00:09:18.852 "aliases": [ 00:09:18.852 "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e" 00:09:18.852 ], 00:09:18.852 "product_name": "Raid Volume", 00:09:18.852 "block_size": 512, 00:09:18.852 "num_blocks": 126976, 00:09:18.852 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:18.852 "assigned_rate_limits": { 00:09:18.852 "rw_ios_per_sec": 0, 00:09:18.852 "rw_mbytes_per_sec": 0, 00:09:18.852 "r_mbytes_per_sec": 0, 00:09:18.852 "w_mbytes_per_sec": 0 00:09:18.852 }, 00:09:18.852 "claimed": false, 00:09:18.852 "zoned": false, 00:09:18.852 "supported_io_types": { 00:09:18.852 "read": true, 00:09:18.852 "write": true, 00:09:18.852 "unmap": true, 00:09:18.852 "flush": true, 00:09:18.852 "reset": true, 00:09:18.852 "nvme_admin": false, 00:09:18.852 "nvme_io": false, 00:09:18.852 "nvme_io_md": false, 00:09:18.852 "write_zeroes": true, 00:09:18.852 "zcopy": false, 00:09:18.852 "get_zone_info": false, 00:09:18.852 "zone_management": false, 00:09:18.852 "zone_append": false, 00:09:18.852 "compare": false, 00:09:18.852 "compare_and_write": false, 00:09:18.852 
"abort": false, 00:09:18.852 "seek_hole": false, 00:09:18.852 "seek_data": false, 00:09:18.852 "copy": false, 00:09:18.852 "nvme_iov_md": false 00:09:18.852 }, 00:09:18.852 "memory_domains": [ 00:09:18.852 { 00:09:18.852 "dma_device_id": "system", 00:09:18.852 "dma_device_type": 1 00:09:18.852 }, 00:09:18.852 { 00:09:18.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:18.852 "dma_device_type": 2 00:09:18.852 }, 00:09:18.852 { 00:09:18.852 "dma_device_id": "system", 00:09:18.852 "dma_device_type": 1 00:09:18.852 }, 00:09:18.852 { 00:09:18.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:18.852 "dma_device_type": 2 00:09:18.852 } 00:09:18.852 ], 00:09:18.852 "driver_specific": { 00:09:18.852 "raid": { 00:09:18.852 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:18.852 "strip_size_kb": 64, 00:09:18.852 "state": "online", 00:09:18.852 "raid_level": "raid0", 00:09:18.852 "superblock": true, 00:09:18.852 "num_base_bdevs": 2, 00:09:18.852 "num_base_bdevs_discovered": 2, 00:09:18.852 "num_base_bdevs_operational": 2, 00:09:18.852 "base_bdevs_list": [ 00:09:18.852 { 00:09:18.852 "name": "pt1", 00:09:18.852 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:18.852 "is_configured": true, 00:09:18.852 "data_offset": 2048, 00:09:18.852 "data_size": 63488 00:09:18.852 }, 00:09:18.852 { 00:09:18.852 "name": "pt2", 00:09:18.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:18.852 "is_configured": true, 00:09:18.852 "data_offset": 2048, 00:09:18.852 "data_size": 63488 00:09:18.852 } 00:09:18.852 ] 00:09:18.852 } 00:09:18.852 } 00:09:18.852 }' 00:09:18.852 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:18.852 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:18.852 pt2' 00:09:18.852 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:18.852 23:31:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:18.852 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:19.110 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:19.110 "name": "pt1", 00:09:19.110 "aliases": [ 00:09:19.110 "00000000-0000-0000-0000-000000000001" 00:09:19.110 ], 00:09:19.110 "product_name": "passthru", 00:09:19.110 "block_size": 512, 00:09:19.110 "num_blocks": 65536, 00:09:19.110 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:19.110 "assigned_rate_limits": { 00:09:19.110 "rw_ios_per_sec": 0, 00:09:19.110 "rw_mbytes_per_sec": 0, 00:09:19.110 "r_mbytes_per_sec": 0, 00:09:19.110 "w_mbytes_per_sec": 0 00:09:19.110 }, 00:09:19.110 "claimed": true, 00:09:19.110 "claim_type": "exclusive_write", 00:09:19.110 "zoned": false, 00:09:19.110 "supported_io_types": { 00:09:19.110 "read": true, 00:09:19.110 "write": true, 00:09:19.110 "unmap": true, 00:09:19.110 "flush": true, 00:09:19.110 "reset": true, 00:09:19.110 "nvme_admin": false, 00:09:19.110 "nvme_io": false, 00:09:19.110 "nvme_io_md": false, 00:09:19.110 "write_zeroes": true, 00:09:19.110 "zcopy": true, 00:09:19.110 "get_zone_info": false, 00:09:19.110 "zone_management": false, 00:09:19.110 "zone_append": false, 00:09:19.110 "compare": false, 00:09:19.110 "compare_and_write": false, 00:09:19.110 "abort": true, 00:09:19.110 "seek_hole": false, 00:09:19.110 "seek_data": false, 00:09:19.110 "copy": true, 00:09:19.110 "nvme_iov_md": false 00:09:19.110 }, 00:09:19.110 "memory_domains": [ 00:09:19.110 { 00:09:19.110 "dma_device_id": "system", 00:09:19.110 "dma_device_type": 1 00:09:19.110 }, 00:09:19.110 { 00:09:19.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.110 "dma_device_type": 2 00:09:19.110 } 00:09:19.110 ], 00:09:19.110 "driver_specific": { 00:09:19.110 "passthru": { 00:09:19.110 
"name": "pt1", 00:09:19.110 "base_bdev_name": "malloc1" 00:09:19.110 } 00:09:19.110 } 00:09:19.110 }' 00:09:19.110 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:19.110 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:19.110 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:19.110 23:31:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:19.110 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:19.368 "name": "pt2", 00:09:19.368 "aliases": [ 00:09:19.368 "00000000-0000-0000-0000-000000000002" 00:09:19.368 ], 00:09:19.368 "product_name": "passthru", 00:09:19.368 "block_size": 512, 00:09:19.368 
"num_blocks": 65536, 00:09:19.368 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:19.368 "assigned_rate_limits": { 00:09:19.368 "rw_ios_per_sec": 0, 00:09:19.368 "rw_mbytes_per_sec": 0, 00:09:19.368 "r_mbytes_per_sec": 0, 00:09:19.368 "w_mbytes_per_sec": 0 00:09:19.368 }, 00:09:19.368 "claimed": true, 00:09:19.368 "claim_type": "exclusive_write", 00:09:19.368 "zoned": false, 00:09:19.368 "supported_io_types": { 00:09:19.368 "read": true, 00:09:19.368 "write": true, 00:09:19.368 "unmap": true, 00:09:19.368 "flush": true, 00:09:19.368 "reset": true, 00:09:19.368 "nvme_admin": false, 00:09:19.368 "nvme_io": false, 00:09:19.368 "nvme_io_md": false, 00:09:19.368 "write_zeroes": true, 00:09:19.368 "zcopy": true, 00:09:19.368 "get_zone_info": false, 00:09:19.368 "zone_management": false, 00:09:19.368 "zone_append": false, 00:09:19.368 "compare": false, 00:09:19.368 "compare_and_write": false, 00:09:19.368 "abort": true, 00:09:19.368 "seek_hole": false, 00:09:19.368 "seek_data": false, 00:09:19.368 "copy": true, 00:09:19.368 "nvme_iov_md": false 00:09:19.368 }, 00:09:19.368 "memory_domains": [ 00:09:19.368 { 00:09:19.368 "dma_device_id": "system", 00:09:19.368 "dma_device_type": 1 00:09:19.368 }, 00:09:19.368 { 00:09:19.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.368 "dma_device_type": 2 00:09:19.368 } 00:09:19.368 ], 00:09:19.368 "driver_specific": { 00:09:19.368 "passthru": { 00:09:19.368 "name": "pt2", 00:09:19.368 "base_bdev_name": "malloc2" 00:09:19.368 } 00:09:19.368 } 00:09:19.368 }' 00:09:19.368 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:19.626 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:19.627 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:19.884 [2024-07-24 23:31:04.794260] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9b2fd66b-1150-4b5c-974e-1d159aa3ff0e 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9b2fd66b-1150-4b5c-974e-1d159aa3ff0e ']' 00:09:19.884 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:20.143 [2024-07-24 23:31:04.962538] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:20.143 [2024-07-24 23:31:04.962552] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:20.143 [2024-07-24 23:31:04.962589] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:20.143 [2024-07-24 
23:31:04.962619] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:20.143 [2024-07-24 23:31:04.962624] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda12d0 name raid_bdev1, state offline 00:09:20.143 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:20.143 23:31:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:20.143 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:20.143 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:20.143 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:20.143 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:20.401 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:20.401 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:20.660 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:20.918 [2024-07-24 23:31:05.776648] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:20.918 [2024-07-24 
23:31:05.777680] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:20.918 [2024-07-24 23:31:05.777723] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:20.918 [2024-07-24 23:31:05.777750] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:20.918 [2024-07-24 23:31:05.777760] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:20.918 [2024-07-24 23:31:05.777766] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda1010 name raid_bdev1, state configuring 00:09:20.918 request: 00:09:20.918 { 00:09:20.918 "name": "raid_bdev1", 00:09:20.918 "raid_level": "raid0", 00:09:20.918 "base_bdevs": [ 00:09:20.918 "malloc1", 00:09:20.918 "malloc2" 00:09:20.918 ], 00:09:20.918 "strip_size_kb": 64, 00:09:20.918 "superblock": false, 00:09:20.918 "method": "bdev_raid_create", 00:09:20.918 "req_id": 1 00:09:20.918 } 00:09:20.918 Got JSON-RPC error response 00:09:20.918 response: 00:09:20.918 { 00:09:20.918 "code": -17, 00:09:20.918 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:20.918 } 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:20.918 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:21.176 23:31:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:21.176 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:21.176 23:31:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:21.176 [2024-07-24 23:31:06.097434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:21.176 [2024-07-24 23:31:06.097462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:21.176 [2024-07-24 23:31:06.097493] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd94720 00:09:21.176 [2024-07-24 23:31:06.097500] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:21.176 [2024-07-24 23:31:06.098669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:21.176 [2024-07-24 23:31:06.098690] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:21.176 [2024-07-24 23:31:06.098733] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:21.176 [2024-07-24 23:31:06.098753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:21.176 pt1 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:21.176 
23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:21.176 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:21.433 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:21.433 "name": "raid_bdev1", 00:09:21.433 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:21.433 "strip_size_kb": 64, 00:09:21.433 "state": "configuring", 00:09:21.433 "raid_level": "raid0", 00:09:21.433 "superblock": true, 00:09:21.434 "num_base_bdevs": 2, 00:09:21.434 "num_base_bdevs_discovered": 1, 00:09:21.434 "num_base_bdevs_operational": 2, 00:09:21.434 "base_bdevs_list": [ 00:09:21.434 { 00:09:21.434 "name": "pt1", 00:09:21.434 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:21.434 "is_configured": true, 00:09:21.434 "data_offset": 2048, 00:09:21.434 "data_size": 63488 00:09:21.434 }, 00:09:21.434 { 00:09:21.434 "name": null, 00:09:21.434 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:21.434 "is_configured": false, 00:09:21.434 "data_offset": 2048, 00:09:21.434 "data_size": 63488 00:09:21.434 } 00:09:21.434 ] 00:09:21.434 }' 00:09:21.434 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:21.434 23:31:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 
-- # set +x 00:09:21.998 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:21.999 [2024-07-24 23:31:06.903543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:21.999 [2024-07-24 23:31:06.903581] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:21.999 [2024-07-24 23:31:06.903592] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe60490 00:09:21.999 [2024-07-24 23:31:06.903597] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:21.999 [2024-07-24 23:31:06.903849] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:21.999 [2024-07-24 23:31:06.903859] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:21.999 [2024-07-24 23:31:06.903902] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:21.999 [2024-07-24 23:31:06.903915] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:21.999 [2024-07-24 23:31:06.903983] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xda0400 00:09:21.999 [2024-07-24 23:31:06.903988] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:21.999 [2024-07-24 23:31:06.904104] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe619a0 00:09:21.999 [2024-07-24 23:31:06.904188] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev 
generic 0xda0400 00:09:21.999 [2024-07-24 23:31:06.904193] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xda0400 00:09:21.999 [2024-07-24 23:31:06.904258] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:21.999 pt2 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:21.999 23:31:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:22.256 23:31:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:22.256 "name": "raid_bdev1", 00:09:22.256 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:22.256 "strip_size_kb": 64, 00:09:22.256 "state": "online", 00:09:22.256 "raid_level": "raid0", 00:09:22.256 "superblock": true, 00:09:22.256 "num_base_bdevs": 2, 00:09:22.256 "num_base_bdevs_discovered": 2, 00:09:22.256 "num_base_bdevs_operational": 2, 00:09:22.256 "base_bdevs_list": [ 00:09:22.256 { 00:09:22.256 "name": "pt1", 00:09:22.256 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:22.256 "is_configured": true, 00:09:22.256 "data_offset": 2048, 00:09:22.256 "data_size": 63488 00:09:22.256 }, 00:09:22.256 { 00:09:22.256 "name": "pt2", 00:09:22.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:22.256 "is_configured": true, 00:09:22.256 "data_offset": 2048, 00:09:22.256 "data_size": 63488 00:09:22.256 } 00:09:22.256 ] 00:09:22.256 }' 00:09:22.256 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:22.256 23:31:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:22.822 23:31:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:22.822 [2024-07-24 23:31:07.693726] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:22.822 "name": "raid_bdev1", 00:09:22.822 "aliases": [ 00:09:22.822 "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e" 00:09:22.822 ], 00:09:22.822 "product_name": "Raid Volume", 00:09:22.822 "block_size": 512, 00:09:22.822 "num_blocks": 126976, 00:09:22.822 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:22.822 "assigned_rate_limits": { 00:09:22.822 "rw_ios_per_sec": 0, 00:09:22.822 "rw_mbytes_per_sec": 0, 00:09:22.822 "r_mbytes_per_sec": 0, 00:09:22.822 "w_mbytes_per_sec": 0 00:09:22.822 }, 00:09:22.822 "claimed": false, 00:09:22.822 "zoned": false, 00:09:22.822 "supported_io_types": { 00:09:22.822 "read": true, 00:09:22.822 "write": true, 00:09:22.822 "unmap": true, 00:09:22.822 "flush": true, 00:09:22.822 "reset": true, 00:09:22.822 "nvme_admin": false, 00:09:22.822 "nvme_io": false, 00:09:22.822 "nvme_io_md": false, 00:09:22.822 "write_zeroes": true, 00:09:22.822 "zcopy": false, 00:09:22.822 "get_zone_info": false, 00:09:22.822 "zone_management": false, 00:09:22.822 "zone_append": false, 00:09:22.822 "compare": false, 00:09:22.822 "compare_and_write": false, 00:09:22.822 "abort": false, 00:09:22.822 "seek_hole": false, 00:09:22.822 "seek_data": false, 00:09:22.822 "copy": false, 00:09:22.822 "nvme_iov_md": false 00:09:22.822 }, 00:09:22.822 "memory_domains": [ 00:09:22.822 { 00:09:22.822 "dma_device_id": "system", 00:09:22.822 "dma_device_type": 1 00:09:22.822 }, 00:09:22.822 { 00:09:22.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:22.822 "dma_device_type": 2 00:09:22.822 }, 00:09:22.822 { 00:09:22.822 "dma_device_id": "system", 00:09:22.822 "dma_device_type": 1 00:09:22.822 }, 00:09:22.822 { 00:09:22.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:22.822 
"dma_device_type": 2 00:09:22.822 } 00:09:22.822 ], 00:09:22.822 "driver_specific": { 00:09:22.822 "raid": { 00:09:22.822 "uuid": "9b2fd66b-1150-4b5c-974e-1d159aa3ff0e", 00:09:22.822 "strip_size_kb": 64, 00:09:22.822 "state": "online", 00:09:22.822 "raid_level": "raid0", 00:09:22.822 "superblock": true, 00:09:22.822 "num_base_bdevs": 2, 00:09:22.822 "num_base_bdevs_discovered": 2, 00:09:22.822 "num_base_bdevs_operational": 2, 00:09:22.822 "base_bdevs_list": [ 00:09:22.822 { 00:09:22.822 "name": "pt1", 00:09:22.822 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:22.822 "is_configured": true, 00:09:22.822 "data_offset": 2048, 00:09:22.822 "data_size": 63488 00:09:22.822 }, 00:09:22.822 { 00:09:22.822 "name": "pt2", 00:09:22.822 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:22.822 "is_configured": true, 00:09:22.822 "data_offset": 2048, 00:09:22.822 "data_size": 63488 00:09:22.822 } 00:09:22.822 ] 00:09:22.822 } 00:09:22.822 } 00:09:22.822 }' 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:22.822 pt2' 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:22.822 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:23.080 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:23.080 "name": "pt1", 00:09:23.080 "aliases": [ 00:09:23.080 "00000000-0000-0000-0000-000000000001" 00:09:23.080 ], 00:09:23.080 "product_name": "passthru", 00:09:23.080 "block_size": 512, 00:09:23.080 "num_blocks": 65536, 
00:09:23.080 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:23.080 "assigned_rate_limits": { 00:09:23.080 "rw_ios_per_sec": 0, 00:09:23.080 "rw_mbytes_per_sec": 0, 00:09:23.080 "r_mbytes_per_sec": 0, 00:09:23.080 "w_mbytes_per_sec": 0 00:09:23.080 }, 00:09:23.080 "claimed": true, 00:09:23.080 "claim_type": "exclusive_write", 00:09:23.080 "zoned": false, 00:09:23.080 "supported_io_types": { 00:09:23.080 "read": true, 00:09:23.080 "write": true, 00:09:23.080 "unmap": true, 00:09:23.080 "flush": true, 00:09:23.080 "reset": true, 00:09:23.080 "nvme_admin": false, 00:09:23.080 "nvme_io": false, 00:09:23.080 "nvme_io_md": false, 00:09:23.080 "write_zeroes": true, 00:09:23.080 "zcopy": true, 00:09:23.080 "get_zone_info": false, 00:09:23.080 "zone_management": false, 00:09:23.080 "zone_append": false, 00:09:23.080 "compare": false, 00:09:23.080 "compare_and_write": false, 00:09:23.080 "abort": true, 00:09:23.080 "seek_hole": false, 00:09:23.080 "seek_data": false, 00:09:23.080 "copy": true, 00:09:23.080 "nvme_iov_md": false 00:09:23.080 }, 00:09:23.080 "memory_domains": [ 00:09:23.080 { 00:09:23.080 "dma_device_id": "system", 00:09:23.080 "dma_device_type": 1 00:09:23.080 }, 00:09:23.080 { 00:09:23.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.080 "dma_device_type": 2 00:09:23.080 } 00:09:23.080 ], 00:09:23.080 "driver_specific": { 00:09:23.080 "passthru": { 00:09:23.080 "name": "pt1", 00:09:23.080 "base_bdev_name": "malloc1" 00:09:23.080 } 00:09:23.080 } 00:09:23.080 }' 00:09:23.080 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.080 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.080 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:23.080 23:31:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:23.080 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:09:23.080 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:23.080 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:23.338 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:23.596 "name": "pt2", 00:09:23.596 "aliases": [ 00:09:23.596 "00000000-0000-0000-0000-000000000002" 00:09:23.596 ], 00:09:23.596 "product_name": "passthru", 00:09:23.596 "block_size": 512, 00:09:23.596 "num_blocks": 65536, 00:09:23.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:23.596 "assigned_rate_limits": { 00:09:23.596 "rw_ios_per_sec": 0, 00:09:23.596 "rw_mbytes_per_sec": 0, 00:09:23.596 "r_mbytes_per_sec": 0, 00:09:23.596 "w_mbytes_per_sec": 0 00:09:23.596 }, 00:09:23.596 "claimed": true, 00:09:23.596 "claim_type": "exclusive_write", 00:09:23.596 "zoned": false, 00:09:23.596 "supported_io_types": { 00:09:23.596 "read": true, 00:09:23.596 "write": true, 00:09:23.596 "unmap": true, 00:09:23.596 "flush": true, 00:09:23.596 "reset": true, 00:09:23.596 "nvme_admin": 
false, 00:09:23.596 "nvme_io": false, 00:09:23.596 "nvme_io_md": false, 00:09:23.596 "write_zeroes": true, 00:09:23.596 "zcopy": true, 00:09:23.596 "get_zone_info": false, 00:09:23.596 "zone_management": false, 00:09:23.596 "zone_append": false, 00:09:23.596 "compare": false, 00:09:23.596 "compare_and_write": false, 00:09:23.596 "abort": true, 00:09:23.596 "seek_hole": false, 00:09:23.596 "seek_data": false, 00:09:23.596 "copy": true, 00:09:23.596 "nvme_iov_md": false 00:09:23.596 }, 00:09:23.596 "memory_domains": [ 00:09:23.596 { 00:09:23.596 "dma_device_id": "system", 00:09:23.596 "dma_device_type": 1 00:09:23.596 }, 00:09:23.596 { 00:09:23.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.596 "dma_device_type": 2 00:09:23.596 } 00:09:23.596 ], 00:09:23.596 "driver_specific": { 00:09:23.596 "passthru": { 00:09:23.596 "name": "pt2", 00:09:23.596 "base_bdev_name": "malloc2" 00:09:23.596 } 00:09:23.596 } 00:09:23.596 }' 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.596 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:23.854 [2024-07-24 23:31:08.812609] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9b2fd66b-1150-4b5c-974e-1d159aa3ff0e '!=' 9b2fd66b-1150-4b5c-974e-1d159aa3ff0e ']' 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 245947 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 245947 ']' 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 245947 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:23.854 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 245947 00:09:24.112 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:24.112 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:24.112 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # 
echo 'killing process with pid 245947' 00:09:24.112 killing process with pid 245947 00:09:24.112 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 245947 00:09:24.112 [2024-07-24 23:31:08.856726] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:24.112 [2024-07-24 23:31:08.856772] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:24.112 [2024-07-24 23:31:08.856804] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:24.112 [2024-07-24 23:31:08.856810] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda0400 name raid_bdev1, state offline 00:09:24.112 23:31:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 245947 00:09:24.112 [2024-07-24 23:31:08.872011] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:24.112 23:31:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:24.112 00:09:24.112 real 0m7.919s 00:09:24.112 user 0m14.215s 00:09:24.112 sys 0m1.242s 00:09:24.112 23:31:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:24.112 23:31:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:24.112 ************************************ 00:09:24.112 END TEST raid_superblock_test 00:09:24.112 ************************************ 00:09:24.112 23:31:09 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:09:24.112 23:31:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:24.112 23:31:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:24.112 23:31:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:24.112 ************************************ 00:09:24.112 START TEST raid_read_error_test 00:09:24.112 ************************************ 00:09:24.112 23:31:09 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 read 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:24.112 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:24.370 23:31:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:24.370 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:24.370 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.b6XNxd7Prz 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=247538 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 247538 /var/tmp/spdk-raid.sock 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 247538 ']' 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:24.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:24.371 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:24.371 [2024-07-24 23:31:09.166854] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:09:24.371 [2024-07-24 23:31:09.166891] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid247538 ] 00:09:24.371 [2024-07-24 23:31:09.230789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:24.371 [2024-07-24 23:31:09.308897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.371 [2024-07-24 23:31:09.363403] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:24.371 [2024-07-24 23:31:09.363429] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:25.304 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:25.304 23:31:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:09:25.304 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:25.304 23:31:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:25.304 BaseBdev1_malloc 00:09:25.304 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:25.304 true 00:09:25.562 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:25.562 [2024-07-24 23:31:10.464515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:25.562 [2024-07-24 23:31:10.464547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:09:25.562 [2024-07-24 23:31:10.464559] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282a550 00:09:25.562 [2024-07-24 23:31:10.464565] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:25.562 [2024-07-24 23:31:10.465804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:25.562 [2024-07-24 23:31:10.465825] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:25.562 BaseBdev1 00:09:25.562 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:25.562 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:25.820 BaseBdev2_malloc 00:09:25.820 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:25.820 true 00:09:25.820 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:26.078 [2024-07-24 23:31:10.937305] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:26.078 [2024-07-24 23:31:10.937334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:26.078 [2024-07-24 23:31:10.937345] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282ed90 00:09:26.078 [2024-07-24 23:31:10.937350] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:26.078 [2024-07-24 23:31:10.938384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:26.078 [2024-07-24 23:31:10.938404] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:26.078 BaseBdev2 00:09:26.078 23:31:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:26.337 [2024-07-24 23:31:11.097754] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:26.337 [2024-07-24 23:31:11.098677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:26.337 [2024-07-24 23:31:11.098803] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x28307a0 00:09:26.337 [2024-07-24 23:31:11.098811] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:26.337 [2024-07-24 23:31:11.098943] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x282f6f0 00:09:26.337 [2024-07-24 23:31:11.099037] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28307a0 00:09:26.337 [2024-07-24 23:31:11.099042] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28307a0 00:09:26.337 [2024-07-24 23:31:11.099109] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:26.337 "name": "raid_bdev1", 00:09:26.337 "uuid": "6f29c042-1469-4d81-b52f-9ce8c4921fb5", 00:09:26.337 "strip_size_kb": 64, 00:09:26.337 "state": "online", 00:09:26.337 "raid_level": "raid0", 00:09:26.337 "superblock": true, 00:09:26.337 "num_base_bdevs": 2, 00:09:26.337 "num_base_bdevs_discovered": 2, 00:09:26.337 "num_base_bdevs_operational": 2, 00:09:26.337 "base_bdevs_list": [ 00:09:26.337 { 00:09:26.337 "name": "BaseBdev1", 00:09:26.337 "uuid": "2a494bc5-2baf-53cf-b7f1-475ca1b6a1b8", 00:09:26.337 "is_configured": true, 00:09:26.337 "data_offset": 2048, 00:09:26.337 "data_size": 63488 00:09:26.337 }, 00:09:26.337 { 00:09:26.337 "name": "BaseBdev2", 00:09:26.337 "uuid": "f0d26afa-0fc0-5305-90a5-4787c0296eeb", 00:09:26.337 "is_configured": true, 00:09:26.337 "data_offset": 2048, 00:09:26.337 "data_size": 63488 00:09:26.337 } 00:09:26.337 ] 00:09:26.337 }' 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:26.337 23:31:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:26.904 23:31:11 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:26.904 23:31:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:26.904 [2024-07-24 23:31:11.851912] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x282be30 00:09:27.839 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:28.098 23:31:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:28.098 23:31:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:28.357 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:28.357 "name": "raid_bdev1", 00:09:28.357 "uuid": "6f29c042-1469-4d81-b52f-9ce8c4921fb5", 00:09:28.357 "strip_size_kb": 64, 00:09:28.357 "state": "online", 00:09:28.357 "raid_level": "raid0", 00:09:28.357 "superblock": true, 00:09:28.357 "num_base_bdevs": 2, 00:09:28.357 "num_base_bdevs_discovered": 2, 00:09:28.357 "num_base_bdevs_operational": 2, 00:09:28.357 "base_bdevs_list": [ 00:09:28.357 { 00:09:28.357 "name": "BaseBdev1", 00:09:28.357 "uuid": "2a494bc5-2baf-53cf-b7f1-475ca1b6a1b8", 00:09:28.357 "is_configured": true, 00:09:28.357 "data_offset": 2048, 00:09:28.357 "data_size": 63488 00:09:28.357 }, 00:09:28.357 { 00:09:28.357 "name": "BaseBdev2", 00:09:28.357 "uuid": "f0d26afa-0fc0-5305-90a5-4787c0296eeb", 00:09:28.357 "is_configured": true, 00:09:28.357 "data_offset": 2048, 00:09:28.357 "data_size": 63488 00:09:28.357 } 00:09:28.357 ] 00:09:28.357 }' 00:09:28.357 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:28.357 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:28.614 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:28.872 [2024-07-24 23:31:13.739664] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:28.872 [2024-07-24 23:31:13.739700] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:09:28.872 [2024-07-24 23:31:13.741649] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:28.872 [2024-07-24 23:31:13.741670] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:28.872 [2024-07-24 23:31:13.741687] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:28.872 [2024-07-24 23:31:13.741692] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28307a0 name raid_bdev1, state offline 00:09:28.872 0 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 247538 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 247538 ']' 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 247538 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 247538 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 247538' 00:09:28.872 killing process with pid 247538 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 247538 00:09:28.872 [2024-07-24 23:31:13.805349] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:28.872 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 247538 00:09:28.872 [2024-07-24 23:31:13.814499] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.b6XNxd7Prz 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:09:29.130 00:09:29.130 real 0m4.896s 00:09:29.130 user 0m7.469s 00:09:29.130 sys 0m0.703s 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:29.130 23:31:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:29.130 ************************************ 00:09:29.130 END TEST raid_read_error_test 00:09:29.130 ************************************ 00:09:29.130 23:31:14 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:09:29.130 23:31:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:29.130 23:31:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:29.130 23:31:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:29.130 ************************************ 00:09:29.130 START TEST raid_write_error_test 00:09:29.130 ************************************ 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 write 00:09:29.130 23:31:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:09:29.130 23:31:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.bn7keQBuRy 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=248383 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 248383 /var/tmp/spdk-raid.sock 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 248383 ']' 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:29.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:29.130 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:29.388 [2024-07-24 23:31:14.133864] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:09:29.388 [2024-07-24 23:31:14.133905] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid248383 ] 00:09:29.388 [2024-07-24 23:31:14.200194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:29.388 [2024-07-24 23:31:14.273934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.388 [2024-07-24 23:31:14.327376] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.388 [2024-07-24 23:31:14.327406] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:29.954 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:29.954 23:31:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:09:29.954 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:29.954 23:31:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:30.212 BaseBdev1_malloc 00:09:30.212 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:30.471 true 00:09:30.471 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:30.471 [2024-07-24 23:31:15.407510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:30.471 [2024-07-24 23:31:15.407543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:09:30.471 [2024-07-24 23:31:15.407553] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5a550 00:09:30.471 [2024-07-24 23:31:15.407559] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:30.471 [2024-07-24 23:31:15.408678] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:30.471 [2024-07-24 23:31:15.408697] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:30.471 BaseBdev1 00:09:30.471 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:30.471 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:30.729 BaseBdev2_malloc 00:09:30.729 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:09:30.987 true 00:09:30.987 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:09:30.987 [2024-07-24 23:31:15.920530] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:09:30.987 [2024-07-24 23:31:15.920561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:30.987 [2024-07-24 23:31:15.920570] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f5ed90 00:09:30.987 [2024-07-24 23:31:15.920576] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:30.987 [2024-07-24 23:31:15.921534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:30.987 [2024-07-24 23:31:15.921553] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:09:30.987 BaseBdev2 00:09:30.987 23:31:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:09:31.245 [2024-07-24 23:31:16.080969] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:31.245 [2024-07-24 23:31:16.081738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:31.245 [2024-07-24 23:31:16.081858] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f607a0 00:09:31.245 [2024-07-24 23:31:16.081866] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:31.245 [2024-07-24 23:31:16.081977] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f5f6f0 00:09:31.245 [2024-07-24 23:31:16.082070] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f607a0 00:09:31.245 [2024-07-24 23:31:16.082075] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f607a0 00:09:31.245 [2024-07-24 23:31:16.082136] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:31.245 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:31.503 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:31.503 "name": "raid_bdev1", 00:09:31.503 "uuid": "5faa2dfd-1fb3-4cce-84d0-051a7afc64c9", 00:09:31.503 "strip_size_kb": 64, 00:09:31.503 "state": "online", 00:09:31.503 "raid_level": "raid0", 00:09:31.503 "superblock": true, 00:09:31.503 "num_base_bdevs": 2, 00:09:31.503 "num_base_bdevs_discovered": 2, 00:09:31.503 "num_base_bdevs_operational": 2, 00:09:31.503 "base_bdevs_list": [ 00:09:31.503 { 00:09:31.503 "name": "BaseBdev1", 00:09:31.503 "uuid": "d24e9596-d90b-51b8-8c21-23b9ae447b0b", 00:09:31.503 "is_configured": true, 00:09:31.503 "data_offset": 2048, 00:09:31.503 "data_size": 63488 00:09:31.503 }, 00:09:31.503 { 00:09:31.503 "name": "BaseBdev2", 00:09:31.503 "uuid": "df8879eb-7509-5784-9753-3ef25fff3ee6", 00:09:31.503 "is_configured": true, 00:09:31.503 "data_offset": 2048, 00:09:31.503 "data_size": 63488 00:09:31.503 } 00:09:31.503 ] 00:09:31.503 }' 00:09:31.503 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:31.503 23:31:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:32.076 
23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:09:32.076 23:31:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:09:32.076 [2024-07-24 23:31:16.843172] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f5be30 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:33.011 23:31:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:33.269 23:31:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:33.269 "name": "raid_bdev1", 00:09:33.269 "uuid": "5faa2dfd-1fb3-4cce-84d0-051a7afc64c9", 00:09:33.269 "strip_size_kb": 64, 00:09:33.269 "state": "online", 00:09:33.269 "raid_level": "raid0", 00:09:33.269 "superblock": true, 00:09:33.269 "num_base_bdevs": 2, 00:09:33.269 "num_base_bdevs_discovered": 2, 00:09:33.269 "num_base_bdevs_operational": 2, 00:09:33.269 "base_bdevs_list": [ 00:09:33.269 { 00:09:33.269 "name": "BaseBdev1", 00:09:33.269 "uuid": "d24e9596-d90b-51b8-8c21-23b9ae447b0b", 00:09:33.269 "is_configured": true, 00:09:33.269 "data_offset": 2048, 00:09:33.269 "data_size": 63488 00:09:33.269 }, 00:09:33.269 { 00:09:33.269 "name": "BaseBdev2", 00:09:33.269 "uuid": "df8879eb-7509-5784-9753-3ef25fff3ee6", 00:09:33.269 "is_configured": true, 00:09:33.269 "data_offset": 2048, 00:09:33.269 "data_size": 63488 00:09:33.269 } 00:09:33.269 ] 00:09:33.269 }' 00:09:33.269 23:31:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:33.269 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:33.836 23:31:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:33.836 [2024-07-24 23:31:18.771474] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:33.836 [2024-07-24 23:31:18.771510] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:09:33.836 [2024-07-24 23:31:18.773578] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:33.836 [2024-07-24 23:31:18.773598] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:33.836 [2024-07-24 23:31:18.773616] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:33.837 [2024-07-24 23:31:18.773621] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f607a0 name raid_bdev1, state offline 00:09:33.837 0 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 248383 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 248383 ']' 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 248383 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 248383 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 248383' 00:09:33.837 killing process with pid 248383 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 248383 00:09:33.837 [2024-07-24 23:31:18.830855] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:33.837 23:31:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 248383 00:09:34.096 
[2024-07-24 23:31:18.840085] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.bn7keQBuRy 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:09:34.096 00:09:34.096 real 0m4.954s 00:09:34.096 user 0m7.597s 00:09:34.096 sys 0m0.723s 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:34.096 23:31:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:34.096 ************************************ 00:09:34.096 END TEST raid_write_error_test 00:09:34.096 ************************************ 00:09:34.096 23:31:19 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:34.096 23:31:19 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:09:34.096 23:31:19 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:34.096 23:31:19 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:34.096 23:31:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:34.096 ************************************ 00:09:34.096 START TEST raid_state_function_test 00:09:34.096 ************************************ 00:09:34.096 
23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 false 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:34.096 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:34.355 23:31:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=249337 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 249337' 00:09:34.355 Process raid pid: 249337 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 249337 /var/tmp/spdk-raid.sock 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 249337 ']' 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:34.355 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:34.355 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:34.355 [2024-07-24 23:31:19.148087] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:34.355 [2024-07-24 23:31:19.148125] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:34.355 [2024-07-24 23:31:19.212583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:34.355 [2024-07-24 23:31:19.281878] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:34.355 [2024-07-24 23:31:19.330653] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:34.355 [2024-07-24 23:31:19.330676] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:35.290 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:35.290 23:31:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:09:35.290 23:31:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:35.290 [2024-07-24 23:31:20.085576] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:35.290 [2024-07-24 23:31:20.085609] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:35.290 [2024-07-24 23:31:20.085615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:35.290 [2024-07-24 23:31:20.085621] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev2 doesn't exist now 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:35.290 "name": "Existed_Raid", 00:09:35.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.290 "strip_size_kb": 64, 00:09:35.290 "state": "configuring", 00:09:35.290 "raid_level": "concat", 00:09:35.290 "superblock": false, 00:09:35.290 "num_base_bdevs": 2, 00:09:35.290 "num_base_bdevs_discovered": 0, 00:09:35.290 "num_base_bdevs_operational": 
2, 00:09:35.290 "base_bdevs_list": [ 00:09:35.290 { 00:09:35.290 "name": "BaseBdev1", 00:09:35.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.290 "is_configured": false, 00:09:35.290 "data_offset": 0, 00:09:35.290 "data_size": 0 00:09:35.290 }, 00:09:35.290 { 00:09:35.290 "name": "BaseBdev2", 00:09:35.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.290 "is_configured": false, 00:09:35.290 "data_offset": 0, 00:09:35.290 "data_size": 0 00:09:35.290 } 00:09:35.290 ] 00:09:35.290 }' 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:35.290 23:31:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.857 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:36.115 [2024-07-24 23:31:20.915608] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:36.115 [2024-07-24 23:31:20.915631] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x134db10 name Existed_Raid, state configuring 00:09:36.115 23:31:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:36.115 [2024-07-24 23:31:21.100124] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:36.115 [2024-07-24 23:31:21.100144] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:36.115 [2024-07-24 23:31:21.100149] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:36.115 [2024-07-24 23:31:21.100154] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:36.373 23:31:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:36.373 [2024-07-24 23:31:21.280694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:36.373 BaseBdev1 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:36.373 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:36.632 [ 00:09:36.632 { 00:09:36.632 "name": "BaseBdev1", 00:09:36.632 "aliases": [ 00:09:36.632 "dbe90875-c68f-45c6-8bf1-cc44313b7530" 00:09:36.632 ], 00:09:36.632 "product_name": "Malloc disk", 00:09:36.632 "block_size": 512, 00:09:36.632 "num_blocks": 65536, 00:09:36.632 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:36.632 "assigned_rate_limits": { 00:09:36.632 "rw_ios_per_sec": 0, 00:09:36.632 "rw_mbytes_per_sec": 0, 00:09:36.632 "r_mbytes_per_sec": 0, 00:09:36.632 "w_mbytes_per_sec": 0 00:09:36.632 }, 00:09:36.632 "claimed": true, 
00:09:36.632 "claim_type": "exclusive_write", 00:09:36.632 "zoned": false, 00:09:36.632 "supported_io_types": { 00:09:36.632 "read": true, 00:09:36.632 "write": true, 00:09:36.632 "unmap": true, 00:09:36.632 "flush": true, 00:09:36.632 "reset": true, 00:09:36.632 "nvme_admin": false, 00:09:36.632 "nvme_io": false, 00:09:36.632 "nvme_io_md": false, 00:09:36.632 "write_zeroes": true, 00:09:36.632 "zcopy": true, 00:09:36.632 "get_zone_info": false, 00:09:36.632 "zone_management": false, 00:09:36.632 "zone_append": false, 00:09:36.632 "compare": false, 00:09:36.632 "compare_and_write": false, 00:09:36.632 "abort": true, 00:09:36.632 "seek_hole": false, 00:09:36.632 "seek_data": false, 00:09:36.632 "copy": true, 00:09:36.632 "nvme_iov_md": false 00:09:36.632 }, 00:09:36.632 "memory_domains": [ 00:09:36.632 { 00:09:36.632 "dma_device_id": "system", 00:09:36.632 "dma_device_type": 1 00:09:36.632 }, 00:09:36.632 { 00:09:36.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:36.632 "dma_device_type": 2 00:09:36.632 } 00:09:36.632 ], 00:09:36.632 "driver_specific": {} 00:09:36.632 } 00:09:36.632 ] 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:36.632 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:36.890 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:36.890 "name": "Existed_Raid", 00:09:36.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:36.890 "strip_size_kb": 64, 00:09:36.890 "state": "configuring", 00:09:36.890 "raid_level": "concat", 00:09:36.890 "superblock": false, 00:09:36.890 "num_base_bdevs": 2, 00:09:36.890 "num_base_bdevs_discovered": 1, 00:09:36.890 "num_base_bdevs_operational": 2, 00:09:36.890 "base_bdevs_list": [ 00:09:36.890 { 00:09:36.890 "name": "BaseBdev1", 00:09:36.890 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:36.890 "is_configured": true, 00:09:36.890 "data_offset": 0, 00:09:36.890 "data_size": 65536 00:09:36.890 }, 00:09:36.890 { 00:09:36.890 "name": "BaseBdev2", 00:09:36.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:36.890 "is_configured": false, 00:09:36.890 "data_offset": 0, 00:09:36.890 "data_size": 0 00:09:36.890 } 00:09:36.890 ] 00:09:36.890 }' 00:09:36.890 23:31:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:36.890 23:31:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:37.456 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:37.456 [2024-07-24 23:31:22.411621] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:37.456 [2024-07-24 23:31:22.411652] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x134d3a0 name Existed_Raid, state configuring 00:09:37.456 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:37.713 [2024-07-24 23:31:22.580073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:37.713 [2024-07-24 23:31:22.581106] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:37.713 [2024-07-24 23:31:22.581131] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:37.713 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:37.714 23:31:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:37.714 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:37.972 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:37.972 "name": "Existed_Raid", 00:09:37.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.972 "strip_size_kb": 64, 00:09:37.972 "state": "configuring", 00:09:37.972 "raid_level": "concat", 00:09:37.972 "superblock": false, 00:09:37.972 "num_base_bdevs": 2, 00:09:37.972 "num_base_bdevs_discovered": 1, 00:09:37.972 "num_base_bdevs_operational": 2, 00:09:37.972 "base_bdevs_list": [ 00:09:37.972 { 00:09:37.972 "name": "BaseBdev1", 00:09:37.972 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:37.972 "is_configured": true, 00:09:37.972 "data_offset": 0, 00:09:37.972 "data_size": 65536 00:09:37.972 }, 00:09:37.972 { 00:09:37.972 "name": "BaseBdev2", 00:09:37.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.972 "is_configured": false, 00:09:37.972 "data_offset": 0, 00:09:37.972 "data_size": 0 00:09:37.972 } 00:09:37.972 ] 00:09:37.972 }' 00:09:37.972 23:31:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:37.972 23:31:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:38.538 [2024-07-24 23:31:23.412950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:38.538 [2024-07-24 23:31:23.412980] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x134e050 00:09:38.538 [2024-07-24 23:31:23.412984] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:38.538 [2024-07-24 23:31:23.413145] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1350650 00:09:38.538 [2024-07-24 23:31:23.413227] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x134e050 00:09:38.538 [2024-07-24 23:31:23.413232] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x134e050 00:09:38.538 [2024-07-24 23:31:23.413350] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:38.538 BaseBdev2 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:38.538 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:38.796 [ 00:09:38.796 { 00:09:38.796 "name": "BaseBdev2", 00:09:38.796 "aliases": [ 00:09:38.796 "6610e634-05d7-451a-9070-8f6966dbc6c0" 00:09:38.796 ], 00:09:38.796 "product_name": "Malloc disk", 00:09:38.796 "block_size": 512, 00:09:38.796 "num_blocks": 65536, 00:09:38.796 "uuid": "6610e634-05d7-451a-9070-8f6966dbc6c0", 00:09:38.796 "assigned_rate_limits": { 00:09:38.796 "rw_ios_per_sec": 0, 00:09:38.796 "rw_mbytes_per_sec": 0, 00:09:38.796 "r_mbytes_per_sec": 0, 00:09:38.796 "w_mbytes_per_sec": 0 00:09:38.796 }, 00:09:38.796 "claimed": true, 00:09:38.796 "claim_type": "exclusive_write", 00:09:38.796 "zoned": false, 00:09:38.796 "supported_io_types": { 00:09:38.796 "read": true, 00:09:38.796 "write": true, 00:09:38.796 "unmap": true, 00:09:38.796 "flush": true, 00:09:38.796 "reset": true, 00:09:38.796 "nvme_admin": false, 00:09:38.796 "nvme_io": false, 00:09:38.796 "nvme_io_md": false, 00:09:38.796 "write_zeroes": true, 00:09:38.796 "zcopy": true, 00:09:38.796 "get_zone_info": false, 00:09:38.796 "zone_management": false, 00:09:38.796 "zone_append": false, 00:09:38.796 "compare": false, 00:09:38.796 "compare_and_write": false, 00:09:38.796 "abort": true, 00:09:38.796 "seek_hole": false, 00:09:38.796 "seek_data": false, 00:09:38.796 "copy": true, 00:09:38.796 "nvme_iov_md": false 00:09:38.796 }, 00:09:38.796 "memory_domains": [ 00:09:38.796 { 00:09:38.796 "dma_device_id": "system", 00:09:38.796 "dma_device_type": 1 00:09:38.796 }, 00:09:38.796 { 00:09:38.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:38.796 "dma_device_type": 2 00:09:38.796 } 00:09:38.796 ], 00:09:38.796 "driver_specific": {} 00:09:38.796 } 00:09:38.796 ] 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.796 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:39.055 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:39.055 "name": "Existed_Raid", 00:09:39.055 "uuid": "1bb963cb-4bee-4fd9-975c-f57759ea751b", 00:09:39.055 "strip_size_kb": 64, 00:09:39.055 "state": "online", 00:09:39.055 "raid_level": "concat", 00:09:39.055 "superblock": false, 00:09:39.055 
"num_base_bdevs": 2, 00:09:39.055 "num_base_bdevs_discovered": 2, 00:09:39.055 "num_base_bdevs_operational": 2, 00:09:39.055 "base_bdevs_list": [ 00:09:39.055 { 00:09:39.055 "name": "BaseBdev1", 00:09:39.055 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:39.055 "is_configured": true, 00:09:39.055 "data_offset": 0, 00:09:39.055 "data_size": 65536 00:09:39.055 }, 00:09:39.055 { 00:09:39.055 "name": "BaseBdev2", 00:09:39.055 "uuid": "6610e634-05d7-451a-9070-8f6966dbc6c0", 00:09:39.055 "is_configured": true, 00:09:39.055 "data_offset": 0, 00:09:39.055 "data_size": 65536 00:09:39.055 } 00:09:39.055 ] 00:09:39.055 }' 00:09:39.055 23:31:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:39.055 23:31:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.620 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:39.620 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:39.620 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:39.621 [2024-07-24 23:31:24.572148] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:39.621 "name": "Existed_Raid", 00:09:39.621 "aliases": [ 00:09:39.621 "1bb963cb-4bee-4fd9-975c-f57759ea751b" 00:09:39.621 ], 00:09:39.621 "product_name": "Raid Volume", 00:09:39.621 "block_size": 512, 00:09:39.621 "num_blocks": 131072, 00:09:39.621 "uuid": "1bb963cb-4bee-4fd9-975c-f57759ea751b", 00:09:39.621 "assigned_rate_limits": { 00:09:39.621 "rw_ios_per_sec": 0, 00:09:39.621 "rw_mbytes_per_sec": 0, 00:09:39.621 "r_mbytes_per_sec": 0, 00:09:39.621 "w_mbytes_per_sec": 0 00:09:39.621 }, 00:09:39.621 "claimed": false, 00:09:39.621 "zoned": false, 00:09:39.621 "supported_io_types": { 00:09:39.621 "read": true, 00:09:39.621 "write": true, 00:09:39.621 "unmap": true, 00:09:39.621 "flush": true, 00:09:39.621 "reset": true, 00:09:39.621 "nvme_admin": false, 00:09:39.621 "nvme_io": false, 00:09:39.621 "nvme_io_md": false, 00:09:39.621 "write_zeroes": true, 00:09:39.621 "zcopy": false, 00:09:39.621 "get_zone_info": false, 00:09:39.621 "zone_management": false, 00:09:39.621 "zone_append": false, 00:09:39.621 "compare": false, 00:09:39.621 "compare_and_write": false, 00:09:39.621 "abort": false, 00:09:39.621 "seek_hole": false, 00:09:39.621 "seek_data": false, 00:09:39.621 "copy": false, 00:09:39.621 "nvme_iov_md": false 00:09:39.621 }, 00:09:39.621 "memory_domains": [ 00:09:39.621 { 00:09:39.621 "dma_device_id": "system", 00:09:39.621 "dma_device_type": 1 00:09:39.621 }, 00:09:39.621 { 00:09:39.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:39.621 "dma_device_type": 2 00:09:39.621 }, 00:09:39.621 { 00:09:39.621 "dma_device_id": "system", 00:09:39.621 "dma_device_type": 1 00:09:39.621 }, 00:09:39.621 { 00:09:39.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:39.621 "dma_device_type": 2 00:09:39.621 } 00:09:39.621 ], 00:09:39.621 "driver_specific": { 00:09:39.621 "raid": { 00:09:39.621 "uuid": "1bb963cb-4bee-4fd9-975c-f57759ea751b", 00:09:39.621 "strip_size_kb": 64, 00:09:39.621 "state": "online", 00:09:39.621 
"raid_level": "concat", 00:09:39.621 "superblock": false, 00:09:39.621 "num_base_bdevs": 2, 00:09:39.621 "num_base_bdevs_discovered": 2, 00:09:39.621 "num_base_bdevs_operational": 2, 00:09:39.621 "base_bdevs_list": [ 00:09:39.621 { 00:09:39.621 "name": "BaseBdev1", 00:09:39.621 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:39.621 "is_configured": true, 00:09:39.621 "data_offset": 0, 00:09:39.621 "data_size": 65536 00:09:39.621 }, 00:09:39.621 { 00:09:39.621 "name": "BaseBdev2", 00:09:39.621 "uuid": "6610e634-05d7-451a-9070-8f6966dbc6c0", 00:09:39.621 "is_configured": true, 00:09:39.621 "data_offset": 0, 00:09:39.621 "data_size": 65536 00:09:39.621 } 00:09:39.621 ] 00:09:39.621 } 00:09:39.621 } 00:09:39.621 }' 00:09:39.621 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:39.879 BaseBdev2' 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:39.879 "name": "BaseBdev1", 00:09:39.879 "aliases": [ 00:09:39.879 "dbe90875-c68f-45c6-8bf1-cc44313b7530" 00:09:39.879 ], 00:09:39.879 "product_name": "Malloc disk", 00:09:39.879 "block_size": 512, 00:09:39.879 "num_blocks": 65536, 00:09:39.879 "uuid": "dbe90875-c68f-45c6-8bf1-cc44313b7530", 00:09:39.879 "assigned_rate_limits": { 00:09:39.879 "rw_ios_per_sec": 0, 00:09:39.879 "rw_mbytes_per_sec": 0, 00:09:39.879 
"r_mbytes_per_sec": 0, 00:09:39.879 "w_mbytes_per_sec": 0 00:09:39.879 }, 00:09:39.879 "claimed": true, 00:09:39.879 "claim_type": "exclusive_write", 00:09:39.879 "zoned": false, 00:09:39.879 "supported_io_types": { 00:09:39.879 "read": true, 00:09:39.879 "write": true, 00:09:39.879 "unmap": true, 00:09:39.879 "flush": true, 00:09:39.879 "reset": true, 00:09:39.879 "nvme_admin": false, 00:09:39.879 "nvme_io": false, 00:09:39.879 "nvme_io_md": false, 00:09:39.879 "write_zeroes": true, 00:09:39.879 "zcopy": true, 00:09:39.879 "get_zone_info": false, 00:09:39.879 "zone_management": false, 00:09:39.879 "zone_append": false, 00:09:39.879 "compare": false, 00:09:39.879 "compare_and_write": false, 00:09:39.879 "abort": true, 00:09:39.879 "seek_hole": false, 00:09:39.879 "seek_data": false, 00:09:39.879 "copy": true, 00:09:39.879 "nvme_iov_md": false 00:09:39.879 }, 00:09:39.879 "memory_domains": [ 00:09:39.879 { 00:09:39.879 "dma_device_id": "system", 00:09:39.879 "dma_device_type": 1 00:09:39.879 }, 00:09:39.879 { 00:09:39.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:39.879 "dma_device_type": 2 00:09:39.879 } 00:09:39.879 ], 00:09:39.879 "driver_specific": {} 00:09:39.879 }' 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:39.879 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:40.138 23:31:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # 
jq .md_interleave 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:40.138 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:40.396 "name": "BaseBdev2", 00:09:40.396 "aliases": [ 00:09:40.396 "6610e634-05d7-451a-9070-8f6966dbc6c0" 00:09:40.396 ], 00:09:40.396 "product_name": "Malloc disk", 00:09:40.396 "block_size": 512, 00:09:40.396 "num_blocks": 65536, 00:09:40.396 "uuid": "6610e634-05d7-451a-9070-8f6966dbc6c0", 00:09:40.396 "assigned_rate_limits": { 00:09:40.396 "rw_ios_per_sec": 0, 00:09:40.396 "rw_mbytes_per_sec": 0, 00:09:40.396 "r_mbytes_per_sec": 0, 00:09:40.396 "w_mbytes_per_sec": 0 00:09:40.396 }, 00:09:40.396 "claimed": true, 00:09:40.396 "claim_type": "exclusive_write", 00:09:40.396 "zoned": false, 00:09:40.396 "supported_io_types": { 00:09:40.396 "read": true, 00:09:40.396 "write": true, 00:09:40.396 "unmap": true, 00:09:40.396 "flush": true, 00:09:40.396 "reset": true, 00:09:40.396 "nvme_admin": false, 00:09:40.396 "nvme_io": false, 00:09:40.396 "nvme_io_md": false, 00:09:40.396 "write_zeroes": true, 00:09:40.396 "zcopy": true, 00:09:40.396 "get_zone_info": false, 00:09:40.396 "zone_management": false, 00:09:40.396 "zone_append": 
false, 00:09:40.396 "compare": false, 00:09:40.396 "compare_and_write": false, 00:09:40.396 "abort": true, 00:09:40.396 "seek_hole": false, 00:09:40.396 "seek_data": false, 00:09:40.396 "copy": true, 00:09:40.396 "nvme_iov_md": false 00:09:40.396 }, 00:09:40.396 "memory_domains": [ 00:09:40.396 { 00:09:40.396 "dma_device_id": "system", 00:09:40.396 "dma_device_type": 1 00:09:40.396 }, 00:09:40.396 { 00:09:40.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:40.396 "dma_device_type": 2 00:09:40.396 } 00:09:40.396 ], 00:09:40.396 "driver_specific": {} 00:09:40.396 }' 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:40.396 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:40.654 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:09:40.912 [2024-07-24 23:31:25.714930] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:40.912 [2024-07-24 23:31:25.714950] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:40.912 [2024-07-24 23:31:25.714978] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:40.912 23:31:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.912 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:40.912 "name": "Existed_Raid", 00:09:40.912 "uuid": "1bb963cb-4bee-4fd9-975c-f57759ea751b", 00:09:40.912 "strip_size_kb": 64, 00:09:40.912 "state": "offline", 00:09:40.912 "raid_level": "concat", 00:09:40.912 "superblock": false, 00:09:40.912 "num_base_bdevs": 2, 00:09:40.912 "num_base_bdevs_discovered": 1, 00:09:40.912 "num_base_bdevs_operational": 1, 00:09:40.912 "base_bdevs_list": [ 00:09:40.912 { 00:09:40.912 "name": null, 00:09:40.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:40.912 "is_configured": false, 00:09:40.912 "data_offset": 0, 00:09:40.912 "data_size": 65536 00:09:40.912 }, 00:09:40.912 { 00:09:40.912 "name": "BaseBdev2", 00:09:40.912 "uuid": "6610e634-05d7-451a-9070-8f6966dbc6c0", 00:09:40.912 "is_configured": true, 00:09:40.912 "data_offset": 0, 00:09:40.912 "data_size": 65536 00:09:40.912 } 00:09:40.912 ] 00:09:40.912 }' 00:09:40.913 23:31:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:40.913 23:31:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:41.480 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:41.480 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:41.480 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:09:41.480 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:41.738 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:41.738 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:41.738 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:41.738 [2024-07-24 23:31:26.734365] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:41.738 [2024-07-24 23:31:26.734406] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x134e050 name Existed_Raid, state offline 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 249337 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 249337 ']' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 249337 00:09:41.996 23:31:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 249337 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 249337' 00:09:41.996 killing process with pid 249337 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 249337 00:09:41.996 [2024-07-24 23:31:26.971436] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:41.996 23:31:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 249337 00:09:41.996 [2024-07-24 23:31:26.972228] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:42.254 00:09:42.254 real 0m8.053s 00:09:42.254 user 0m14.401s 00:09:42.254 sys 0m1.277s 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:42.254 ************************************ 00:09:42.254 END TEST raid_state_function_test 00:09:42.254 ************************************ 00:09:42.254 23:31:27 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:09:42.254 23:31:27 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:42.254 23:31:27 bdev_raid -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:09:42.254 23:31:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:42.254 ************************************ 00:09:42.254 START TEST raid_state_function_test_sb 00:09:42.254 ************************************ 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 true 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:09:42.254 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:42.255 23:31:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=250931 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 250931' 00:09:42.255 Process raid pid: 250931 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 250931 /var/tmp/spdk-raid.sock 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 250931 ']' 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 
00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:42.255 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:42.255 23:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:42.513 [2024-07-24 23:31:27.271053] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:42.513 [2024-07-24 23:31:27.271089] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:42.513 [2024-07-24 23:31:27.334053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.513 [2024-07-24 23:31:27.412094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:42.513 [2024-07-24 23:31:27.461905] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:42.513 [2024-07-24 23:31:27.461943] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:43.079 23:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:43.079 23:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:09:43.079 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:43.336 [2024-07-24 23:31:28.208630] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:43.336 [2024-07-24 
23:31:28.208659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:43.336 [2024-07-24 23:31:28.208664] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:43.336 [2024-07-24 23:31:28.208670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.336 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:43.594 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:09:43.594 "name": "Existed_Raid", 00:09:43.594 "uuid": "800c8432-f1d9-4162-8013-ef5941ff99b8", 00:09:43.594 "strip_size_kb": 64, 00:09:43.594 "state": "configuring", 00:09:43.594 "raid_level": "concat", 00:09:43.594 "superblock": true, 00:09:43.594 "num_base_bdevs": 2, 00:09:43.594 "num_base_bdevs_discovered": 0, 00:09:43.594 "num_base_bdevs_operational": 2, 00:09:43.594 "base_bdevs_list": [ 00:09:43.594 { 00:09:43.594 "name": "BaseBdev1", 00:09:43.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:43.594 "is_configured": false, 00:09:43.594 "data_offset": 0, 00:09:43.594 "data_size": 0 00:09:43.594 }, 00:09:43.594 { 00:09:43.594 "name": "BaseBdev2", 00:09:43.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:43.594 "is_configured": false, 00:09:43.594 "data_offset": 0, 00:09:43.594 "data_size": 0 00:09:43.594 } 00:09:43.594 ] 00:09:43.594 }' 00:09:43.594 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:43.594 23:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:44.159 23:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:44.159 [2024-07-24 23:31:29.050729] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:44.159 [2024-07-24 23:31:29.050753] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1498b10 name Existed_Raid, state configuring 00:09:44.159 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:44.417 [2024-07-24 23:31:29.235228] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:44.417 [2024-07-24 
23:31:29.235246] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:44.417 [2024-07-24 23:31:29.235251] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:44.417 [2024-07-24 23:31:29.235256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:44.417 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:44.417 [2024-07-24 23:31:29.415963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:44.417 BaseBdev1 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:44.675 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:44.933 [ 00:09:44.933 { 00:09:44.933 "name": "BaseBdev1", 00:09:44.933 "aliases": [ 00:09:44.933 "497c3056-b145-4ff2-884e-01dd690f90bd" 
00:09:44.933 ], 00:09:44.933 "product_name": "Malloc disk", 00:09:44.933 "block_size": 512, 00:09:44.933 "num_blocks": 65536, 00:09:44.933 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:44.933 "assigned_rate_limits": { 00:09:44.933 "rw_ios_per_sec": 0, 00:09:44.933 "rw_mbytes_per_sec": 0, 00:09:44.933 "r_mbytes_per_sec": 0, 00:09:44.933 "w_mbytes_per_sec": 0 00:09:44.933 }, 00:09:44.933 "claimed": true, 00:09:44.933 "claim_type": "exclusive_write", 00:09:44.933 "zoned": false, 00:09:44.933 "supported_io_types": { 00:09:44.933 "read": true, 00:09:44.933 "write": true, 00:09:44.933 "unmap": true, 00:09:44.933 "flush": true, 00:09:44.933 "reset": true, 00:09:44.933 "nvme_admin": false, 00:09:44.933 "nvme_io": false, 00:09:44.933 "nvme_io_md": false, 00:09:44.933 "write_zeroes": true, 00:09:44.933 "zcopy": true, 00:09:44.933 "get_zone_info": false, 00:09:44.933 "zone_management": false, 00:09:44.933 "zone_append": false, 00:09:44.933 "compare": false, 00:09:44.933 "compare_and_write": false, 00:09:44.933 "abort": true, 00:09:44.933 "seek_hole": false, 00:09:44.933 "seek_data": false, 00:09:44.933 "copy": true, 00:09:44.933 "nvme_iov_md": false 00:09:44.933 }, 00:09:44.933 "memory_domains": [ 00:09:44.933 { 00:09:44.933 "dma_device_id": "system", 00:09:44.933 "dma_device_type": 1 00:09:44.933 }, 00:09:44.933 { 00:09:44.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.933 "dma_device_type": 2 00:09:44.934 } 00:09:44.934 ], 00:09:44.934 "driver_specific": {} 00:09:44.934 } 00:09:44.934 ] 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:44.934 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:45.191 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:45.191 "name": "Existed_Raid", 00:09:45.191 "uuid": "5f28f16c-1922-457c-9c6f-fe4b95d8d62d", 00:09:45.191 "strip_size_kb": 64, 00:09:45.191 "state": "configuring", 00:09:45.191 "raid_level": "concat", 00:09:45.191 "superblock": true, 00:09:45.191 "num_base_bdevs": 2, 00:09:45.191 "num_base_bdevs_discovered": 1, 00:09:45.191 "num_base_bdevs_operational": 2, 00:09:45.191 "base_bdevs_list": [ 00:09:45.191 { 00:09:45.191 "name": "BaseBdev1", 00:09:45.191 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:45.191 "is_configured": true, 00:09:45.191 "data_offset": 2048, 00:09:45.191 "data_size": 63488 00:09:45.191 }, 00:09:45.191 { 00:09:45.191 "name": "BaseBdev2", 00:09:45.191 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:09:45.191 "is_configured": false, 00:09:45.191 "data_offset": 0, 00:09:45.191 "data_size": 0 00:09:45.191 } 00:09:45.191 ] 00:09:45.191 }' 00:09:45.191 23:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:45.191 23:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:45.449 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:45.707 [2024-07-24 23:31:30.595005] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:45.708 [2024-07-24 23:31:30.595036] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14983a0 name Existed_Raid, state configuring 00:09:45.708 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:45.965 [2024-07-24 23:31:30.771510] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:45.965 [2024-07-24 23:31:30.772548] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:45.966 [2024-07-24 23:31:30.772572] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:45.966 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:45.966 "name": "Existed_Raid", 00:09:45.966 "uuid": "67808d70-b6f2-4a62-974b-af85cc52db5f", 00:09:45.966 "strip_size_kb": 64, 00:09:45.966 "state": "configuring", 00:09:45.966 "raid_level": "concat", 00:09:45.966 "superblock": true, 00:09:45.966 "num_base_bdevs": 2, 00:09:45.966 "num_base_bdevs_discovered": 1, 00:09:45.966 "num_base_bdevs_operational": 2, 00:09:45.966 "base_bdevs_list": [ 00:09:45.966 { 00:09:45.966 "name": "BaseBdev1", 00:09:45.966 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:45.966 "is_configured": true, 00:09:45.966 "data_offset": 2048, 00:09:45.966 "data_size": 
63488 00:09:45.966 }, 00:09:45.966 { 00:09:45.966 "name": "BaseBdev2", 00:09:45.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:45.966 "is_configured": false, 00:09:45.966 "data_offset": 0, 00:09:45.966 "data_size": 0 00:09:45.966 } 00:09:45.966 ] 00:09:45.966 }' 00:09:46.223 23:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:46.223 23:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:46.481 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:46.740 [2024-07-24 23:31:31.596182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:46.740 [2024-07-24 23:31:31.596285] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1499050 00:09:46.740 [2024-07-24 23:31:31.596293] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:46.740 [2024-07-24 23:31:31.596405] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x149ceb0 00:09:46.740 [2024-07-24 23:31:31.596487] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1499050 00:09:46.740 [2024-07-24 23:31:31.596493] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1499050 00:09:46.740 [2024-07-24 23:31:31.596569] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:46.740 BaseBdev2 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:09:46.740 23:31:31 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:09:46.740 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:46.998 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:46.998 [ 00:09:46.998 { 00:09:46.998 "name": "BaseBdev2", 00:09:46.998 "aliases": [ 00:09:46.998 "b3fb316e-3737-49a3-8c21-382b48e19035" 00:09:46.998 ], 00:09:46.998 "product_name": "Malloc disk", 00:09:46.998 "block_size": 512, 00:09:46.998 "num_blocks": 65536, 00:09:46.998 "uuid": "b3fb316e-3737-49a3-8c21-382b48e19035", 00:09:46.998 "assigned_rate_limits": { 00:09:46.999 "rw_ios_per_sec": 0, 00:09:46.999 "rw_mbytes_per_sec": 0, 00:09:46.999 "r_mbytes_per_sec": 0, 00:09:46.999 "w_mbytes_per_sec": 0 00:09:46.999 }, 00:09:46.999 "claimed": true, 00:09:46.999 "claim_type": "exclusive_write", 00:09:46.999 "zoned": false, 00:09:46.999 "supported_io_types": { 00:09:46.999 "read": true, 00:09:46.999 "write": true, 00:09:46.999 "unmap": true, 00:09:46.999 "flush": true, 00:09:46.999 "reset": true, 00:09:46.999 "nvme_admin": false, 00:09:46.999 "nvme_io": false, 00:09:46.999 "nvme_io_md": false, 00:09:46.999 "write_zeroes": true, 00:09:46.999 "zcopy": true, 00:09:46.999 "get_zone_info": false, 00:09:46.999 "zone_management": false, 00:09:46.999 "zone_append": false, 00:09:46.999 "compare": false, 00:09:46.999 "compare_and_write": false, 00:09:46.999 "abort": true, 00:09:46.999 "seek_hole": false, 00:09:46.999 "seek_data": false, 
00:09:46.999 "copy": true, 00:09:46.999 "nvme_iov_md": false 00:09:46.999 }, 00:09:46.999 "memory_domains": [ 00:09:46.999 { 00:09:46.999 "dma_device_id": "system", 00:09:46.999 "dma_device_type": 1 00:09:46.999 }, 00:09:46.999 { 00:09:46.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:46.999 "dma_device_type": 2 00:09:46.999 } 00:09:46.999 ], 00:09:46.999 "driver_specific": {} 00:09:46.999 } 00:09:46.999 ] 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:46.999 23:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:47.257 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:47.257 "name": "Existed_Raid", 00:09:47.257 "uuid": "67808d70-b6f2-4a62-974b-af85cc52db5f", 00:09:47.257 "strip_size_kb": 64, 00:09:47.257 "state": "online", 00:09:47.257 "raid_level": "concat", 00:09:47.257 "superblock": true, 00:09:47.257 "num_base_bdevs": 2, 00:09:47.257 "num_base_bdevs_discovered": 2, 00:09:47.257 "num_base_bdevs_operational": 2, 00:09:47.257 "base_bdevs_list": [ 00:09:47.257 { 00:09:47.257 "name": "BaseBdev1", 00:09:47.257 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:47.257 "is_configured": true, 00:09:47.257 "data_offset": 2048, 00:09:47.257 "data_size": 63488 00:09:47.257 }, 00:09:47.257 { 00:09:47.257 "name": "BaseBdev2", 00:09:47.257 "uuid": "b3fb316e-3737-49a3-8c21-382b48e19035", 00:09:47.257 "is_configured": true, 00:09:47.257 "data_offset": 2048, 00:09:47.257 "data_size": 63488 00:09:47.257 } 00:09:47.257 ] 00:09:47.258 }' 00:09:47.258 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:47.258 23:31:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:47.824 23:31:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:47.824 [2024-07-24 23:31:32.751320] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:47.824 "name": "Existed_Raid", 00:09:47.824 "aliases": [ 00:09:47.824 "67808d70-b6f2-4a62-974b-af85cc52db5f" 00:09:47.824 ], 00:09:47.824 "product_name": "Raid Volume", 00:09:47.824 "block_size": 512, 00:09:47.824 "num_blocks": 126976, 00:09:47.824 "uuid": "67808d70-b6f2-4a62-974b-af85cc52db5f", 00:09:47.824 "assigned_rate_limits": { 00:09:47.824 "rw_ios_per_sec": 0, 00:09:47.824 "rw_mbytes_per_sec": 0, 00:09:47.824 "r_mbytes_per_sec": 0, 00:09:47.824 "w_mbytes_per_sec": 0 00:09:47.824 }, 00:09:47.824 "claimed": false, 00:09:47.824 "zoned": false, 00:09:47.824 "supported_io_types": { 00:09:47.824 "read": true, 00:09:47.824 "write": true, 00:09:47.824 "unmap": true, 00:09:47.824 "flush": true, 00:09:47.824 "reset": true, 00:09:47.824 "nvme_admin": false, 00:09:47.824 "nvme_io": false, 00:09:47.824 "nvme_io_md": false, 00:09:47.824 "write_zeroes": true, 00:09:47.824 "zcopy": false, 00:09:47.824 "get_zone_info": false, 00:09:47.824 "zone_management": false, 00:09:47.824 "zone_append": false, 00:09:47.824 "compare": false, 00:09:47.824 "compare_and_write": false, 00:09:47.824 "abort": false, 00:09:47.824 "seek_hole": false, 00:09:47.824 "seek_data": false, 00:09:47.824 "copy": false, 00:09:47.824 "nvme_iov_md": false 00:09:47.824 }, 00:09:47.824 
"memory_domains": [ 00:09:47.824 { 00:09:47.824 "dma_device_id": "system", 00:09:47.824 "dma_device_type": 1 00:09:47.824 }, 00:09:47.824 { 00:09:47.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:47.824 "dma_device_type": 2 00:09:47.824 }, 00:09:47.824 { 00:09:47.824 "dma_device_id": "system", 00:09:47.824 "dma_device_type": 1 00:09:47.824 }, 00:09:47.824 { 00:09:47.824 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:47.824 "dma_device_type": 2 00:09:47.824 } 00:09:47.824 ], 00:09:47.824 "driver_specific": { 00:09:47.824 "raid": { 00:09:47.824 "uuid": "67808d70-b6f2-4a62-974b-af85cc52db5f", 00:09:47.824 "strip_size_kb": 64, 00:09:47.824 "state": "online", 00:09:47.824 "raid_level": "concat", 00:09:47.824 "superblock": true, 00:09:47.824 "num_base_bdevs": 2, 00:09:47.824 "num_base_bdevs_discovered": 2, 00:09:47.824 "num_base_bdevs_operational": 2, 00:09:47.824 "base_bdevs_list": [ 00:09:47.824 { 00:09:47.824 "name": "BaseBdev1", 00:09:47.824 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:47.824 "is_configured": true, 00:09:47.824 "data_offset": 2048, 00:09:47.824 "data_size": 63488 00:09:47.824 }, 00:09:47.824 { 00:09:47.824 "name": "BaseBdev2", 00:09:47.824 "uuid": "b3fb316e-3737-49a3-8c21-382b48e19035", 00:09:47.824 "is_configured": true, 00:09:47.824 "data_offset": 2048, 00:09:47.824 "data_size": 63488 00:09:47.824 } 00:09:47.824 ] 00:09:47.824 } 00:09:47.824 } 00:09:47.824 }' 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:47.824 BaseBdev2' 00:09:47.824 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:47.825 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:47.825 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:48.083 23:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:48.083 "name": "BaseBdev1", 00:09:48.083 "aliases": [ 00:09:48.083 "497c3056-b145-4ff2-884e-01dd690f90bd" 00:09:48.083 ], 00:09:48.083 "product_name": "Malloc disk", 00:09:48.083 "block_size": 512, 00:09:48.083 "num_blocks": 65536, 00:09:48.083 "uuid": "497c3056-b145-4ff2-884e-01dd690f90bd", 00:09:48.083 "assigned_rate_limits": { 00:09:48.083 "rw_ios_per_sec": 0, 00:09:48.083 "rw_mbytes_per_sec": 0, 00:09:48.083 "r_mbytes_per_sec": 0, 00:09:48.083 "w_mbytes_per_sec": 0 00:09:48.083 }, 00:09:48.083 "claimed": true, 00:09:48.083 "claim_type": "exclusive_write", 00:09:48.083 "zoned": false, 00:09:48.083 "supported_io_types": { 00:09:48.083 "read": true, 00:09:48.083 "write": true, 00:09:48.083 "unmap": true, 00:09:48.083 "flush": true, 00:09:48.083 "reset": true, 00:09:48.083 "nvme_admin": false, 00:09:48.083 "nvme_io": false, 00:09:48.083 "nvme_io_md": false, 00:09:48.083 "write_zeroes": true, 00:09:48.083 "zcopy": true, 00:09:48.083 "get_zone_info": false, 00:09:48.083 "zone_management": false, 00:09:48.083 "zone_append": false, 00:09:48.083 "compare": false, 00:09:48.083 "compare_and_write": false, 00:09:48.083 "abort": true, 00:09:48.083 "seek_hole": false, 00:09:48.083 "seek_data": false, 00:09:48.083 "copy": true, 00:09:48.083 "nvme_iov_md": false 00:09:48.083 }, 00:09:48.083 "memory_domains": [ 00:09:48.083 { 00:09:48.083 "dma_device_id": "system", 00:09:48.083 "dma_device_type": 1 00:09:48.083 }, 00:09:48.083 { 00:09:48.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:48.083 "dma_device_type": 2 00:09:48.083 } 00:09:48.083 ], 00:09:48.083 "driver_specific": {} 00:09:48.083 }' 00:09:48.083 23:31:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:48.083 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:48.083 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:48.083 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:48.341 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:48.598 "name": "BaseBdev2", 00:09:48.598 "aliases": [ 00:09:48.598 "b3fb316e-3737-49a3-8c21-382b48e19035" 00:09:48.598 ], 00:09:48.598 "product_name": "Malloc disk", 00:09:48.598 "block_size": 512, 00:09:48.598 
"num_blocks": 65536, 00:09:48.598 "uuid": "b3fb316e-3737-49a3-8c21-382b48e19035", 00:09:48.598 "assigned_rate_limits": { 00:09:48.598 "rw_ios_per_sec": 0, 00:09:48.598 "rw_mbytes_per_sec": 0, 00:09:48.598 "r_mbytes_per_sec": 0, 00:09:48.598 "w_mbytes_per_sec": 0 00:09:48.598 }, 00:09:48.598 "claimed": true, 00:09:48.598 "claim_type": "exclusive_write", 00:09:48.598 "zoned": false, 00:09:48.598 "supported_io_types": { 00:09:48.598 "read": true, 00:09:48.598 "write": true, 00:09:48.598 "unmap": true, 00:09:48.598 "flush": true, 00:09:48.598 "reset": true, 00:09:48.598 "nvme_admin": false, 00:09:48.598 "nvme_io": false, 00:09:48.598 "nvme_io_md": false, 00:09:48.598 "write_zeroes": true, 00:09:48.598 "zcopy": true, 00:09:48.598 "get_zone_info": false, 00:09:48.598 "zone_management": false, 00:09:48.598 "zone_append": false, 00:09:48.598 "compare": false, 00:09:48.598 "compare_and_write": false, 00:09:48.598 "abort": true, 00:09:48.598 "seek_hole": false, 00:09:48.598 "seek_data": false, 00:09:48.598 "copy": true, 00:09:48.598 "nvme_iov_md": false 00:09:48.598 }, 00:09:48.598 "memory_domains": [ 00:09:48.598 { 00:09:48.598 "dma_device_id": "system", 00:09:48.598 "dma_device_type": 1 00:09:48.598 }, 00:09:48.598 { 00:09:48.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:48.598 "dma_device_type": 2 00:09:48.598 } 00:09:48.598 ], 00:09:48.598 "driver_specific": {} 00:09:48.598 }' 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:48.598 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:48.855 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:49.147 [2024-07-24 23:31:33.938253] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:49.147 [2024-07-24 23:31:33.938273] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:49.147 [2024-07-24 23:31:33.938300] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.147 23:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:49.147 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:49.147 "name": "Existed_Raid", 00:09:49.147 "uuid": "67808d70-b6f2-4a62-974b-af85cc52db5f", 00:09:49.147 "strip_size_kb": 64, 00:09:49.147 "state": "offline", 00:09:49.147 "raid_level": "concat", 00:09:49.147 "superblock": true, 00:09:49.147 "num_base_bdevs": 2, 00:09:49.147 "num_base_bdevs_discovered": 1, 00:09:49.147 "num_base_bdevs_operational": 1, 00:09:49.147 "base_bdevs_list": [ 00:09:49.147 { 00:09:49.147 "name": null, 00:09:49.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:49.147 "is_configured": false, 00:09:49.147 "data_offset": 2048, 
00:09:49.147 "data_size": 63488 00:09:49.147 }, 00:09:49.147 { 00:09:49.147 "name": "BaseBdev2", 00:09:49.147 "uuid": "b3fb316e-3737-49a3-8c21-382b48e19035", 00:09:49.147 "is_configured": true, 00:09:49.147 "data_offset": 2048, 00:09:49.147 "data_size": 63488 00:09:49.147 } 00:09:49.147 ] 00:09:49.147 }' 00:09:49.147 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:49.147 23:31:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:49.711 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:49.711 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:49.711 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.711 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:49.969 [2024-07-24 23:31:34.909751] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:49.969 [2024-07-24 23:31:34.909789] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1499050 name Existed_Raid, state offline 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:49.969 23:31:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:49.969 23:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 250931 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 250931 ']' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 250931 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 250931 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 250931' 00:09:50.227 killing process with pid 250931 00:09:50.227 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 250931 00:09:50.227 [2024-07-24 23:31:35.148739] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:50.227 23:31:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 250931 00:09:50.227 [2024-07-24 23:31:35.149518] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:50.486 23:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:50.486 00:09:50.486 real 0m8.107s 00:09:50.486 user 0m14.535s 00:09:50.486 sys 0m1.278s 00:09:50.486 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:50.486 23:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:50.486 ************************************ 00:09:50.486 END TEST raid_state_function_test_sb 00:09:50.486 ************************************ 00:09:50.486 23:31:35 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:09:50.486 23:31:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:50.486 23:31:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:50.486 23:31:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:50.486 ************************************ 00:09:50.486 START TEST raid_superblock_test 00:09:50.486 ************************************ 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 2 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local 
base_bdevs_pt 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:50.486 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=252525 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 252525 /var/tmp/spdk-raid.sock 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 252525 ']' 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk-raid.sock...' 00:09:50.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:50.487 23:31:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:50.487 [2024-07-24 23:31:35.426911] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:09:50.487 [2024-07-24 23:31:35.426947] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid252525 ] 00:09:50.746 [2024-07-24 23:31:35.489113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:50.746 [2024-07-24 23:31:35.567081] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.746 [2024-07-24 23:31:35.618079] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:50.746 [2024-07-24 23:31:35.618103] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:51.312 23:31:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:51.312 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:51.571 malloc1 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:51.571 [2024-07-24 23:31:36.525421] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:51.571 [2024-07-24 23:31:36.525455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:51.571 [2024-07-24 23:31:36.525470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acedd0 00:09:51.571 [2024-07-24 23:31:36.525476] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:51.571 [2024-07-24 23:31:36.526562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:51.571 [2024-07-24 23:31:36.526583] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:51.571 pt1 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:51.571 23:31:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:51.571 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:51.830 malloc2 00:09:51.830 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:52.088 [2024-07-24 23:31:36.849862] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:52.088 [2024-07-24 23:31:36.849891] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:52.088 [2024-07-24 23:31:36.849905] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acf8d0 00:09:52.088 [2024-07-24 23:31:36.849910] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:52.088 [2024-07-24 23:31:36.850861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:52.088 [2024-07-24 23:31:36.850881] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:52.088 pt2 00:09:52.088 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:52.088 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:52.088 23:31:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:09:52.088 [2024-07-24 23:31:37.030341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:52.088 [2024-07-24 23:31:37.031223] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:52.088 [2024-07-24 23:31:37.031320] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad22d0 00:09:52.088 [2024-07-24 23:31:37.031328] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:52.088 [2024-07-24 23:31:37.031454] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad1730 00:09:52.088 [2024-07-24 23:31:37.031561] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad22d0 00:09:52.088 [2024-07-24 23:31:37.031566] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad22d0 00:09:52.088 [2024-07-24 23:31:37.031628] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:52.088 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:52.089 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:52.347 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:52.347 "name": "raid_bdev1", 00:09:52.347 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:52.347 "strip_size_kb": 64, 00:09:52.347 "state": "online", 00:09:52.347 "raid_level": "concat", 00:09:52.347 "superblock": true, 00:09:52.347 "num_base_bdevs": 2, 00:09:52.347 "num_base_bdevs_discovered": 2, 00:09:52.347 "num_base_bdevs_operational": 2, 00:09:52.347 "base_bdevs_list": [ 00:09:52.347 { 00:09:52.347 "name": "pt1", 00:09:52.347 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:52.347 "is_configured": true, 00:09:52.347 "data_offset": 2048, 00:09:52.347 "data_size": 63488 00:09:52.347 }, 00:09:52.347 { 00:09:52.347 "name": "pt2", 00:09:52.347 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:52.347 "is_configured": true, 00:09:52.347 "data_offset": 2048, 00:09:52.347 "data_size": 63488 00:09:52.347 } 00:09:52.347 ] 00:09:52.347 }' 00:09:52.347 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:52.347 23:31:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:52.912 23:31:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:52.912 [2024-07-24 23:31:37.816509] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:52.912 "name": "raid_bdev1", 00:09:52.912 "aliases": [ 00:09:52.912 "b967975f-4483-4cfd-a633-62a6e769aa49" 00:09:52.912 ], 00:09:52.912 "product_name": "Raid Volume", 00:09:52.912 "block_size": 512, 00:09:52.912 "num_blocks": 126976, 00:09:52.912 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:52.912 "assigned_rate_limits": { 00:09:52.912 "rw_ios_per_sec": 0, 00:09:52.912 "rw_mbytes_per_sec": 0, 00:09:52.912 "r_mbytes_per_sec": 0, 00:09:52.912 "w_mbytes_per_sec": 0 00:09:52.912 }, 00:09:52.912 "claimed": false, 00:09:52.912 "zoned": false, 00:09:52.912 "supported_io_types": { 00:09:52.912 "read": true, 00:09:52.912 "write": true, 00:09:52.912 "unmap": true, 00:09:52.912 "flush": true, 00:09:52.912 "reset": true, 00:09:52.912 "nvme_admin": false, 00:09:52.912 "nvme_io": false, 00:09:52.912 "nvme_io_md": false, 00:09:52.912 "write_zeroes": true, 00:09:52.912 "zcopy": false, 00:09:52.912 "get_zone_info": false, 00:09:52.912 "zone_management": false, 00:09:52.912 "zone_append": false, 00:09:52.912 "compare": false, 00:09:52.912 "compare_and_write": false, 00:09:52.912 
"abort": false, 00:09:52.912 "seek_hole": false, 00:09:52.912 "seek_data": false, 00:09:52.912 "copy": false, 00:09:52.912 "nvme_iov_md": false 00:09:52.912 }, 00:09:52.912 "memory_domains": [ 00:09:52.912 { 00:09:52.912 "dma_device_id": "system", 00:09:52.912 "dma_device_type": 1 00:09:52.912 }, 00:09:52.912 { 00:09:52.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:52.912 "dma_device_type": 2 00:09:52.912 }, 00:09:52.912 { 00:09:52.912 "dma_device_id": "system", 00:09:52.912 "dma_device_type": 1 00:09:52.912 }, 00:09:52.912 { 00:09:52.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:52.912 "dma_device_type": 2 00:09:52.912 } 00:09:52.912 ], 00:09:52.912 "driver_specific": { 00:09:52.912 "raid": { 00:09:52.912 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:52.912 "strip_size_kb": 64, 00:09:52.912 "state": "online", 00:09:52.912 "raid_level": "concat", 00:09:52.912 "superblock": true, 00:09:52.912 "num_base_bdevs": 2, 00:09:52.912 "num_base_bdevs_discovered": 2, 00:09:52.912 "num_base_bdevs_operational": 2, 00:09:52.912 "base_bdevs_list": [ 00:09:52.912 { 00:09:52.912 "name": "pt1", 00:09:52.912 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:52.912 "is_configured": true, 00:09:52.912 "data_offset": 2048, 00:09:52.912 "data_size": 63488 00:09:52.912 }, 00:09:52.912 { 00:09:52.912 "name": "pt2", 00:09:52.912 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:52.912 "is_configured": true, 00:09:52.912 "data_offset": 2048, 00:09:52.912 "data_size": 63488 00:09:52.912 } 00:09:52.912 ] 00:09:52.912 } 00:09:52.912 } 00:09:52.912 }' 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:52.912 pt2' 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:52.912 23:31:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:52.912 23:31:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:53.170 "name": "pt1", 00:09:53.170 "aliases": [ 00:09:53.170 "00000000-0000-0000-0000-000000000001" 00:09:53.170 ], 00:09:53.170 "product_name": "passthru", 00:09:53.170 "block_size": 512, 00:09:53.170 "num_blocks": 65536, 00:09:53.170 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:53.170 "assigned_rate_limits": { 00:09:53.170 "rw_ios_per_sec": 0, 00:09:53.170 "rw_mbytes_per_sec": 0, 00:09:53.170 "r_mbytes_per_sec": 0, 00:09:53.170 "w_mbytes_per_sec": 0 00:09:53.170 }, 00:09:53.170 "claimed": true, 00:09:53.170 "claim_type": "exclusive_write", 00:09:53.170 "zoned": false, 00:09:53.170 "supported_io_types": { 00:09:53.170 "read": true, 00:09:53.170 "write": true, 00:09:53.170 "unmap": true, 00:09:53.170 "flush": true, 00:09:53.170 "reset": true, 00:09:53.170 "nvme_admin": false, 00:09:53.170 "nvme_io": false, 00:09:53.170 "nvme_io_md": false, 00:09:53.170 "write_zeroes": true, 00:09:53.170 "zcopy": true, 00:09:53.170 "get_zone_info": false, 00:09:53.170 "zone_management": false, 00:09:53.170 "zone_append": false, 00:09:53.170 "compare": false, 00:09:53.170 "compare_and_write": false, 00:09:53.170 "abort": true, 00:09:53.170 "seek_hole": false, 00:09:53.170 "seek_data": false, 00:09:53.170 "copy": true, 00:09:53.170 "nvme_iov_md": false 00:09:53.170 }, 00:09:53.170 "memory_domains": [ 00:09:53.170 { 00:09:53.170 "dma_device_id": "system", 00:09:53.170 "dma_device_type": 1 00:09:53.170 }, 00:09:53.170 { 00:09:53.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:53.170 "dma_device_type": 2 00:09:53.170 } 00:09:53.170 ], 00:09:53.170 "driver_specific": { 00:09:53.170 "passthru": { 00:09:53.170 
"name": "pt1", 00:09:53.170 "base_bdev_name": "malloc1" 00:09:53.170 } 00:09:53.170 } 00:09:53.170 }' 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:53.170 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:53.428 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:53.685 "name": "pt2", 00:09:53.685 "aliases": [ 00:09:53.685 "00000000-0000-0000-0000-000000000002" 00:09:53.685 ], 00:09:53.685 "product_name": "passthru", 00:09:53.685 "block_size": 512, 00:09:53.685 
"num_blocks": 65536, 00:09:53.685 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:53.685 "assigned_rate_limits": { 00:09:53.685 "rw_ios_per_sec": 0, 00:09:53.685 "rw_mbytes_per_sec": 0, 00:09:53.685 "r_mbytes_per_sec": 0, 00:09:53.685 "w_mbytes_per_sec": 0 00:09:53.685 }, 00:09:53.685 "claimed": true, 00:09:53.685 "claim_type": "exclusive_write", 00:09:53.685 "zoned": false, 00:09:53.685 "supported_io_types": { 00:09:53.685 "read": true, 00:09:53.685 "write": true, 00:09:53.685 "unmap": true, 00:09:53.685 "flush": true, 00:09:53.685 "reset": true, 00:09:53.685 "nvme_admin": false, 00:09:53.685 "nvme_io": false, 00:09:53.685 "nvme_io_md": false, 00:09:53.685 "write_zeroes": true, 00:09:53.685 "zcopy": true, 00:09:53.685 "get_zone_info": false, 00:09:53.685 "zone_management": false, 00:09:53.685 "zone_append": false, 00:09:53.685 "compare": false, 00:09:53.685 "compare_and_write": false, 00:09:53.685 "abort": true, 00:09:53.685 "seek_hole": false, 00:09:53.685 "seek_data": false, 00:09:53.685 "copy": true, 00:09:53.685 "nvme_iov_md": false 00:09:53.685 }, 00:09:53.685 "memory_domains": [ 00:09:53.685 { 00:09:53.685 "dma_device_id": "system", 00:09:53.685 "dma_device_type": 1 00:09:53.685 }, 00:09:53.685 { 00:09:53.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:53.685 "dma_device_type": 2 00:09:53.685 } 00:09:53.685 ], 00:09:53.685 "driver_specific": { 00:09:53.685 "passthru": { 00:09:53.685 "name": "pt2", 00:09:53.685 "base_bdev_name": "malloc2" 00:09:53.685 } 00:09:53.685 } 00:09:53.685 }' 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:53.685 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:53.942 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:09:53.942 [2024-07-24 23:31:38.931389] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:54.199 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b967975f-4483-4cfd-a633-62a6e769aa49 00:09:54.199 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b967975f-4483-4cfd-a633-62a6e769aa49 ']' 00:09:54.199 23:31:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:54.199 [2024-07-24 23:31:39.103690] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:54.199 [2024-07-24 23:31:39.103705] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:54.199 [2024-07-24 23:31:39.103743] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:54.199 [2024-07-24 
23:31:39.103771] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:54.199 [2024-07-24 23:31:39.103777] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad22d0 name raid_bdev1, state offline 00:09:54.199 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:54.199 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:09:54.456 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:09:54.456 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:09:54.456 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:54.456 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:54.714 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:09:54.714 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:54.714 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:54.714 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:54.971 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:54.971 [2024-07-24 23:31:39.969918] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:54.971 [2024-07-24 
23:31:39.970921] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:54.971 [2024-07-24 23:31:39.970963] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:54.971 [2024-07-24 23:31:39.970990] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:54.971 [2024-07-24 23:31:39.971000] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:54.971 [2024-07-24 23:31:39.971006] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad2010 name raid_bdev1, state configuring 00:09:55.228 request: 00:09:55.228 { 00:09:55.228 "name": "raid_bdev1", 00:09:55.228 "raid_level": "concat", 00:09:55.228 "base_bdevs": [ 00:09:55.228 "malloc1", 00:09:55.228 "malloc2" 00:09:55.228 ], 00:09:55.228 "strip_size_kb": 64, 00:09:55.228 "superblock": false, 00:09:55.228 "method": "bdev_raid_create", 00:09:55.228 "req_id": 1 00:09:55.228 } 00:09:55.228 Got JSON-RPC error response 00:09:55.228 response: 00:09:55.228 { 00:09:55.228 "code": -17, 00:09:55.228 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:55.228 } 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:55.228 23:31:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:09:55.228 23:31:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:09:55.228 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:09:55.228 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:55.485 [2024-07-24 23:31:40.314775] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:55.485 [2024-07-24 23:31:40.314809] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:55.485 [2024-07-24 23:31:40.314823] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b92ab0 00:09:55.485 [2024-07-24 23:31:40.314829] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:55.485 [2024-07-24 23:31:40.316027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:55.485 [2024-07-24 23:31:40.316047] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:55.485 [2024-07-24 23:31:40.316092] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:55.485 [2024-07-24 23:31:40.316110] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:55.485 pt1 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:55.485 
23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:55.485 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:55.741 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:55.741 "name": "raid_bdev1", 00:09:55.741 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:55.741 "strip_size_kb": 64, 00:09:55.741 "state": "configuring", 00:09:55.741 "raid_level": "concat", 00:09:55.741 "superblock": true, 00:09:55.741 "num_base_bdevs": 2, 00:09:55.741 "num_base_bdevs_discovered": 1, 00:09:55.741 "num_base_bdevs_operational": 2, 00:09:55.741 "base_bdevs_list": [ 00:09:55.741 { 00:09:55.741 "name": "pt1", 00:09:55.741 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:55.741 "is_configured": true, 00:09:55.741 "data_offset": 2048, 00:09:55.741 "data_size": 63488 00:09:55.741 }, 00:09:55.741 { 00:09:55.741 "name": null, 00:09:55.741 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:55.741 "is_configured": false, 00:09:55.741 "data_offset": 2048, 00:09:55.741 "data_size": 63488 00:09:55.741 } 00:09:55.741 ] 00:09:55.741 }' 00:09:55.741 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:55.741 23:31:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 
-- # set +x 00:09:55.999 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:09:55.999 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:09:55.999 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:55.999 23:31:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:56.255 [2024-07-24 23:31:41.128868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:56.255 [2024-07-24 23:31:41.128909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:56.255 [2024-07-24 23:31:41.128920] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ad18b0 00:09:56.255 [2024-07-24 23:31:41.128925] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:56.255 [2024-07-24 23:31:41.129168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:56.255 [2024-07-24 23:31:41.129178] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:56.255 [2024-07-24 23:31:41.129218] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:56.255 [2024-07-24 23:31:41.129230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:56.255 [2024-07-24 23:31:41.129295] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad1190 00:09:56.255 [2024-07-24 23:31:41.129300] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:56.255 [2024-07-24 23:31:41.129403] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac8a10 00:09:56.255 [2024-07-24 23:31:41.129492] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0x1ad1190 00:09:56.255 [2024-07-24 23:31:41.129514] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ad1190 00:09:56.255 [2024-07-24 23:31:41.129579] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:56.255 pt2 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:56.255 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:56.512 23:31:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:56.512 "name": "raid_bdev1", 00:09:56.512 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:56.512 "strip_size_kb": 64, 00:09:56.512 "state": "online", 00:09:56.512 "raid_level": "concat", 00:09:56.512 "superblock": true, 00:09:56.512 "num_base_bdevs": 2, 00:09:56.512 "num_base_bdevs_discovered": 2, 00:09:56.512 "num_base_bdevs_operational": 2, 00:09:56.512 "base_bdevs_list": [ 00:09:56.512 { 00:09:56.512 "name": "pt1", 00:09:56.512 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:56.512 "is_configured": true, 00:09:56.512 "data_offset": 2048, 00:09:56.512 "data_size": 63488 00:09:56.512 }, 00:09:56.512 { 00:09:56.512 "name": "pt2", 00:09:56.512 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:56.512 "is_configured": true, 00:09:56.512 "data_offset": 2048, 00:09:56.512 "data_size": 63488 00:09:56.512 } 00:09:56.512 ] 00:09:56.512 }' 00:09:56.512 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:56.512 23:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:57.078 23:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:57.078 23:31:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:57.078 [2024-07-24 23:31:41.987291] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:57.078 "name": "raid_bdev1", 00:09:57.078 "aliases": [ 00:09:57.078 "b967975f-4483-4cfd-a633-62a6e769aa49" 00:09:57.078 ], 00:09:57.078 "product_name": "Raid Volume", 00:09:57.078 "block_size": 512, 00:09:57.078 "num_blocks": 126976, 00:09:57.078 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:57.078 "assigned_rate_limits": { 00:09:57.078 "rw_ios_per_sec": 0, 00:09:57.078 "rw_mbytes_per_sec": 0, 00:09:57.078 "r_mbytes_per_sec": 0, 00:09:57.078 "w_mbytes_per_sec": 0 00:09:57.078 }, 00:09:57.078 "claimed": false, 00:09:57.078 "zoned": false, 00:09:57.078 "supported_io_types": { 00:09:57.078 "read": true, 00:09:57.078 "write": true, 00:09:57.078 "unmap": true, 00:09:57.078 "flush": true, 00:09:57.078 "reset": true, 00:09:57.078 "nvme_admin": false, 00:09:57.078 "nvme_io": false, 00:09:57.078 "nvme_io_md": false, 00:09:57.078 "write_zeroes": true, 00:09:57.078 "zcopy": false, 00:09:57.078 "get_zone_info": false, 00:09:57.078 "zone_management": false, 00:09:57.078 "zone_append": false, 00:09:57.078 "compare": false, 00:09:57.078 "compare_and_write": false, 00:09:57.078 "abort": false, 00:09:57.078 "seek_hole": false, 00:09:57.078 "seek_data": false, 00:09:57.078 "copy": false, 00:09:57.078 "nvme_iov_md": false 00:09:57.078 }, 00:09:57.078 "memory_domains": [ 00:09:57.078 { 00:09:57.078 "dma_device_id": "system", 00:09:57.078 "dma_device_type": 1 00:09:57.078 }, 00:09:57.078 { 00:09:57.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:57.078 "dma_device_type": 2 00:09:57.078 }, 00:09:57.078 { 00:09:57.078 "dma_device_id": "system", 00:09:57.078 "dma_device_type": 1 00:09:57.078 }, 00:09:57.078 { 00:09:57.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:57.078 
"dma_device_type": 2 00:09:57.078 } 00:09:57.078 ], 00:09:57.078 "driver_specific": { 00:09:57.078 "raid": { 00:09:57.078 "uuid": "b967975f-4483-4cfd-a633-62a6e769aa49", 00:09:57.078 "strip_size_kb": 64, 00:09:57.078 "state": "online", 00:09:57.078 "raid_level": "concat", 00:09:57.078 "superblock": true, 00:09:57.078 "num_base_bdevs": 2, 00:09:57.078 "num_base_bdevs_discovered": 2, 00:09:57.078 "num_base_bdevs_operational": 2, 00:09:57.078 "base_bdevs_list": [ 00:09:57.078 { 00:09:57.078 "name": "pt1", 00:09:57.078 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:57.078 "is_configured": true, 00:09:57.078 "data_offset": 2048, 00:09:57.078 "data_size": 63488 00:09:57.078 }, 00:09:57.078 { 00:09:57.078 "name": "pt2", 00:09:57.078 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:57.078 "is_configured": true, 00:09:57.078 "data_offset": 2048, 00:09:57.078 "data_size": 63488 00:09:57.078 } 00:09:57.078 ] 00:09:57.078 } 00:09:57.078 } 00:09:57.078 }' 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:57.078 pt2' 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:57.078 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:57.336 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:57.336 "name": "pt1", 00:09:57.336 "aliases": [ 00:09:57.336 "00000000-0000-0000-0000-000000000001" 00:09:57.336 ], 00:09:57.336 "product_name": "passthru", 00:09:57.336 "block_size": 512, 00:09:57.336 "num_blocks": 65536, 
00:09:57.336 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:57.336 "assigned_rate_limits": { 00:09:57.336 "rw_ios_per_sec": 0, 00:09:57.336 "rw_mbytes_per_sec": 0, 00:09:57.336 "r_mbytes_per_sec": 0, 00:09:57.336 "w_mbytes_per_sec": 0 00:09:57.336 }, 00:09:57.336 "claimed": true, 00:09:57.336 "claim_type": "exclusive_write", 00:09:57.336 "zoned": false, 00:09:57.336 "supported_io_types": { 00:09:57.336 "read": true, 00:09:57.336 "write": true, 00:09:57.336 "unmap": true, 00:09:57.336 "flush": true, 00:09:57.336 "reset": true, 00:09:57.336 "nvme_admin": false, 00:09:57.336 "nvme_io": false, 00:09:57.336 "nvme_io_md": false, 00:09:57.336 "write_zeroes": true, 00:09:57.336 "zcopy": true, 00:09:57.336 "get_zone_info": false, 00:09:57.336 "zone_management": false, 00:09:57.336 "zone_append": false, 00:09:57.336 "compare": false, 00:09:57.336 "compare_and_write": false, 00:09:57.336 "abort": true, 00:09:57.336 "seek_hole": false, 00:09:57.336 "seek_data": false, 00:09:57.336 "copy": true, 00:09:57.336 "nvme_iov_md": false 00:09:57.336 }, 00:09:57.336 "memory_domains": [ 00:09:57.336 { 00:09:57.336 "dma_device_id": "system", 00:09:57.336 "dma_device_type": 1 00:09:57.336 }, 00:09:57.336 { 00:09:57.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:57.336 "dma_device_type": 2 00:09:57.336 } 00:09:57.336 ], 00:09:57.336 "driver_specific": { 00:09:57.336 "passthru": { 00:09:57.336 "name": "pt1", 00:09:57.336 "base_bdev_name": "malloc1" 00:09:57.336 } 00:09:57.336 } 00:09:57.336 }' 00:09:57.336 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:57.336 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:57.337 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:57.337 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:57.594 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:57.852 "name": "pt2", 00:09:57.852 "aliases": [ 00:09:57.852 "00000000-0000-0000-0000-000000000002" 00:09:57.852 ], 00:09:57.852 "product_name": "passthru", 00:09:57.852 "block_size": 512, 00:09:57.852 "num_blocks": 65536, 00:09:57.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:57.852 "assigned_rate_limits": { 00:09:57.852 "rw_ios_per_sec": 0, 00:09:57.852 "rw_mbytes_per_sec": 0, 00:09:57.852 "r_mbytes_per_sec": 0, 00:09:57.852 "w_mbytes_per_sec": 0 00:09:57.852 }, 00:09:57.852 "claimed": true, 00:09:57.852 "claim_type": "exclusive_write", 00:09:57.852 "zoned": false, 00:09:57.852 "supported_io_types": { 00:09:57.852 "read": true, 00:09:57.852 "write": true, 00:09:57.852 "unmap": true, 00:09:57.852 "flush": true, 00:09:57.852 "reset": true, 00:09:57.852 "nvme_admin": 
false, 00:09:57.852 "nvme_io": false, 00:09:57.852 "nvme_io_md": false, 00:09:57.852 "write_zeroes": true, 00:09:57.852 "zcopy": true, 00:09:57.852 "get_zone_info": false, 00:09:57.852 "zone_management": false, 00:09:57.852 "zone_append": false, 00:09:57.852 "compare": false, 00:09:57.852 "compare_and_write": false, 00:09:57.852 "abort": true, 00:09:57.852 "seek_hole": false, 00:09:57.852 "seek_data": false, 00:09:57.852 "copy": true, 00:09:57.852 "nvme_iov_md": false 00:09:57.852 }, 00:09:57.852 "memory_domains": [ 00:09:57.852 { 00:09:57.852 "dma_device_id": "system", 00:09:57.852 "dma_device_type": 1 00:09:57.852 }, 00:09:57.852 { 00:09:57.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:57.852 "dma_device_type": 2 00:09:57.852 } 00:09:57.852 ], 00:09:57.852 "driver_specific": { 00:09:57.852 "passthru": { 00:09:57.852 "name": "pt2", 00:09:57.852 "base_bdev_name": "malloc2" 00:09:57.852 } 00:09:57.852 } 00:09:57.852 }' 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:57.852 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:58.110 23:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:09:58.367 [2024-07-24 23:31:43.138274] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b967975f-4483-4cfd-a633-62a6e769aa49 '!=' b967975f-4483-4cfd-a633-62a6e769aa49 ']' 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 252525 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 252525 ']' 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 252525 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 252525 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # 
echo 'killing process with pid 252525' 00:09:58.367 killing process with pid 252525 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 252525 00:09:58.367 [2024-07-24 23:31:43.179444] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:58.367 [2024-07-24 23:31:43.179488] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:58.367 [2024-07-24 23:31:43.179517] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:58.367 [2024-07-24 23:31:43.179523] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad1190 name raid_bdev1, state offline 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 252525 00:09:58.367 [2024-07-24 23:31:43.194806] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:09:58.367 00:09:58.367 real 0m7.988s 00:09:58.367 user 0m14.388s 00:09:58.367 sys 0m1.245s 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:58.367 23:31:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:58.367 ************************************ 00:09:58.367 END TEST raid_superblock_test 00:09:58.367 ************************************ 00:09:58.625 23:31:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:09:58.625 23:31:43 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:58.625 23:31:43 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:58.625 23:31:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:58.625 ************************************ 00:09:58.625 START TEST raid_read_error_test 00:09:58.625 ************************************ 00:09:58.625 23:31:43 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 read 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:09:58.625 23:31:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.n50VlS2KRy 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=254113 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 254113 /var/tmp/spdk-raid.sock 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 254113 ']' 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:58.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:58.625 23:31:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:09:58.625 [2024-07-24 23:31:43.488764] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:09:58.625 [2024-07-24 23:31:43.488812] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid254113 ] 00:09:58.625 [2024-07-24 23:31:43.550988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:58.882 [2024-07-24 23:31:43.629983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.882 [2024-07-24 23:31:43.680045] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:58.882 [2024-07-24 23:31:43.680071] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:59.446 23:31:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:59.446 23:31:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:09:59.446 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:59.446 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:09:59.446 BaseBdev1_malloc 00:09:59.446 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:09:59.705 true 00:09:59.705 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:09:59.963 [2024-07-24 23:31:44.727822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:09:59.963 [2024-07-24 23:31:44.727852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:09:59.963 [2024-07-24 23:31:44.727864] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d0550 00:09:59.963 [2024-07-24 23:31:44.727870] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:59.963 [2024-07-24 23:31:44.729050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:59.963 [2024-07-24 23:31:44.729071] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:09:59.963 BaseBdev1 00:09:59.963 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:09:59.963 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:09:59.963 BaseBdev2_malloc 00:09:59.963 23:31:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:00.221 true 00:10:00.221 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:00.221 [2024-07-24 23:31:45.220669] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:00.221 [2024-07-24 23:31:45.220702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:00.221 [2024-07-24 23:31:45.220714] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d4d90 00:10:00.221 [2024-07-24 23:31:45.220720] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:00.479 [2024-07-24 23:31:45.221809] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:00.479 [2024-07-24 23:31:45.221830] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:00.479 BaseBdev2 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:00.479 [2024-07-24 23:31:45.377104] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:00.479 [2024-07-24 23:31:45.377996] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:00.479 [2024-07-24 23:31:45.378119] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17d67a0 00:10:00.479 [2024-07-24 23:31:45.378127] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:00.479 [2024-07-24 23:31:45.378256] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d7710 00:10:00.479 [2024-07-24 23:31:45.378351] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17d67a0 00:10:00.479 [2024-07-24 23:31:45.378356] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17d67a0 00:10:00.479 [2024-07-24 23:31:45.378422] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:00.479 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:00.737 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:00.737 "name": "raid_bdev1", 00:10:00.737 "uuid": "0c88c639-5fd1-449e-a596-9725d3ecc780", 00:10:00.737 "strip_size_kb": 64, 00:10:00.737 "state": "online", 00:10:00.737 "raid_level": "concat", 00:10:00.737 "superblock": true, 00:10:00.737 "num_base_bdevs": 2, 00:10:00.737 "num_base_bdevs_discovered": 2, 00:10:00.737 "num_base_bdevs_operational": 2, 00:10:00.737 "base_bdevs_list": [ 00:10:00.737 { 00:10:00.737 "name": "BaseBdev1", 00:10:00.737 "uuid": "cf739361-1b9f-500f-8bd9-d95d01abbdd8", 00:10:00.737 "is_configured": true, 00:10:00.737 "data_offset": 2048, 00:10:00.737 "data_size": 63488 00:10:00.737 }, 00:10:00.737 { 00:10:00.737 "name": "BaseBdev2", 00:10:00.737 "uuid": "171d2e35-bb65-51ac-97fd-515e2600dafd", 00:10:00.737 "is_configured": true, 00:10:00.737 "data_offset": 2048, 00:10:00.737 "data_size": 63488 00:10:00.737 } 00:10:00.737 ] 00:10:00.737 }' 00:10:00.737 23:31:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:00.737 23:31:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.302 23:31:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:01.302 23:31:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:01.302 [2024-07-24 23:31:46.119213] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d1e30 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:02.235 23:31:47 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:02.235 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:02.493 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:02.493 "name": "raid_bdev1", 00:10:02.493 "uuid": "0c88c639-5fd1-449e-a596-9725d3ecc780", 00:10:02.493 "strip_size_kb": 64, 00:10:02.493 "state": "online", 00:10:02.493 "raid_level": "concat", 00:10:02.493 "superblock": true, 00:10:02.493 "num_base_bdevs": 2, 00:10:02.493 "num_base_bdevs_discovered": 2, 00:10:02.493 "num_base_bdevs_operational": 2, 00:10:02.493 "base_bdevs_list": [ 00:10:02.493 { 00:10:02.493 "name": "BaseBdev1", 00:10:02.493 "uuid": "cf739361-1b9f-500f-8bd9-d95d01abbdd8", 00:10:02.493 "is_configured": true, 00:10:02.493 "data_offset": 2048, 00:10:02.493 "data_size": 63488 00:10:02.493 }, 00:10:02.493 { 00:10:02.493 "name": "BaseBdev2", 00:10:02.493 "uuid": "171d2e35-bb65-51ac-97fd-515e2600dafd", 00:10:02.493 "is_configured": true, 00:10:02.493 "data_offset": 2048, 00:10:02.493 "data_size": 63488 00:10:02.493 } 00:10:02.493 ] 00:10:02.493 }' 00:10:02.493 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:02.493 23:31:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:03.060 23:31:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:03.060 [2024-07-24 23:31:48.039763] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:03.060 [2024-07-24 23:31:48.039795] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:10:03.060 [2024-07-24 23:31:48.041871] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:03.060 [2024-07-24 23:31:48.041892] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:03.060 [2024-07-24 23:31:48.041910] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:03.060 [2024-07-24 23:31:48.041915] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d67a0 name raid_bdev1, state offline 00:10:03.060 0 00:10:03.060 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 254113 00:10:03.060 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 254113 ']' 00:10:03.060 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 254113 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 254113 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 254113' 00:10:03.317 killing process with pid 254113 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 254113 00:10:03.317 [2024-07-24 23:31:48.099727] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 254113 00:10:03.317 [2024-07-24 23:31:48.109063] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.n50VlS2KRy 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:10:03.317 00:10:03.317 real 0m4.872s 00:10:03.317 user 0m7.478s 00:10:03.317 sys 0m0.659s 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:03.317 23:31:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:03.317 ************************************ 00:10:03.317 END TEST raid_read_error_test 00:10:03.317 ************************************ 00:10:03.575 23:31:48 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:03.575 23:31:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:03.575 23:31:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:03.575 23:31:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:03.575 ************************************ 00:10:03.575 START TEST raid_write_error_test 00:10:03.575 ************************************ 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 write 00:10:03.575 23:31:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:03.575 23:31:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.F3PQN1OFSm 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=255093 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 255093 /var/tmp/spdk-raid.sock 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 255093 ']' 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:03.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:03.575 23:31:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:03.575 [2024-07-24 23:31:48.428868] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:10:03.575 [2024-07-24 23:31:48.428906] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid255093 ] 00:10:03.575 [2024-07-24 23:31:48.495217] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.575 [2024-07-24 23:31:48.570087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.833 [2024-07-24 23:31:48.621655] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:03.834 [2024-07-24 23:31:48.621680] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:04.400 23:31:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:04.400 23:31:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:10:04.400 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:04.400 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:04.400 BaseBdev1_malloc 00:10:04.400 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:04.657 true 00:10:04.657 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:04.915 [2024-07-24 23:31:49.721367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:04.915 [2024-07-24 23:31:49.721401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:10:04.915 [2024-07-24 23:31:49.721413] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a7550 00:10:04.915 [2024-07-24 23:31:49.721420] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:04.915 [2024-07-24 23:31:49.722686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:04.915 [2024-07-24 23:31:49.722710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:04.915 BaseBdev1 00:10:04.915 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:04.915 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:04.915 BaseBdev2_malloc 00:10:04.915 23:31:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:05.173 true 00:10:05.173 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:05.432 [2024-07-24 23:31:50.238027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:05.433 [2024-07-24 23:31:50.238063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:05.433 [2024-07-24 23:31:50.238074] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21abd90 00:10:05.433 [2024-07-24 23:31:50.238079] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:05.433 [2024-07-24 23:31:50.239080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:05.433 [2024-07-24 23:31:50.239102] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:05.433 BaseBdev2 00:10:05.433 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:05.433 [2024-07-24 23:31:50.414527] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:05.433 [2024-07-24 23:31:50.415316] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:05.433 [2024-07-24 23:31:50.415441] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x21ad7a0 00:10:05.433 [2024-07-24 23:31:50.415453] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:05.433 [2024-07-24 23:31:50.415580] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ae710 00:10:05.433 [2024-07-24 23:31:50.415675] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21ad7a0 00:10:05.433 [2024-07-24 23:31:50.415680] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21ad7a0 00:10:05.433 [2024-07-24 23:31:50.415744] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:05.723 "name": "raid_bdev1", 00:10:05.723 "uuid": "25ae5846-ffc4-4cc1-8e74-f8adc2c1fdb0", 00:10:05.723 "strip_size_kb": 64, 00:10:05.723 "state": "online", 00:10:05.723 "raid_level": "concat", 00:10:05.723 "superblock": true, 00:10:05.723 "num_base_bdevs": 2, 00:10:05.723 "num_base_bdevs_discovered": 2, 00:10:05.723 "num_base_bdevs_operational": 2, 00:10:05.723 "base_bdevs_list": [ 00:10:05.723 { 00:10:05.723 "name": "BaseBdev1", 00:10:05.723 "uuid": "f7989eb1-9f15-54f0-ba76-1ed942684d12", 00:10:05.723 "is_configured": true, 00:10:05.723 "data_offset": 2048, 00:10:05.723 "data_size": 63488 00:10:05.723 }, 00:10:05.723 { 00:10:05.723 "name": "BaseBdev2", 00:10:05.723 "uuid": "dedd334f-39e3-5fe2-a11d-52429bd95a72", 00:10:05.723 "is_configured": true, 00:10:05.723 "data_offset": 2048, 00:10:05.723 "data_size": 63488 00:10:05.723 } 00:10:05.723 ] 00:10:05.723 }' 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:05.723 23:31:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.298 
23:31:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:06.298 23:31:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:06.298 [2024-07-24 23:31:51.164666] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a8e30 00:10:07.231 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:07.489 "name": "raid_bdev1", 00:10:07.489 "uuid": "25ae5846-ffc4-4cc1-8e74-f8adc2c1fdb0", 00:10:07.489 "strip_size_kb": 64, 00:10:07.489 "state": "online", 00:10:07.489 "raid_level": "concat", 00:10:07.489 "superblock": true, 00:10:07.489 "num_base_bdevs": 2, 00:10:07.489 "num_base_bdevs_discovered": 2, 00:10:07.489 "num_base_bdevs_operational": 2, 00:10:07.489 "base_bdevs_list": [ 00:10:07.489 { 00:10:07.489 "name": "BaseBdev1", 00:10:07.489 "uuid": "f7989eb1-9f15-54f0-ba76-1ed942684d12", 00:10:07.489 "is_configured": true, 00:10:07.489 "data_offset": 2048, 00:10:07.489 "data_size": 63488 00:10:07.489 }, 00:10:07.489 { 00:10:07.489 "name": "BaseBdev2", 00:10:07.489 "uuid": "dedd334f-39e3-5fe2-a11d-52429bd95a72", 00:10:07.489 "is_configured": true, 00:10:07.489 "data_offset": 2048, 00:10:07.489 "data_size": 63488 00:10:07.489 } 00:10:07.489 ] 00:10:07.489 }' 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:07.489 23:31:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:08.055 23:31:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:08.313 [2024-07-24 23:31:53.060851] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:08.313 [2024-07-24 23:31:53.060880] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:10:08.313 [2024-07-24 23:31:53.062942] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:08.313 [2024-07-24 23:31:53.062963] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:08.313 [2024-07-24 23:31:53.062980] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:08.313 [2024-07-24 23:31:53.062985] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ad7a0 name raid_bdev1, state offline 00:10:08.313 0 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 255093 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 255093 ']' 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 255093 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 255093 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 255093' 00:10:08.313 killing process with pid 255093 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 255093 00:10:08.313 [2024-07-24 23:31:53.121929] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 255093 00:10:08.313 
[2024-07-24 23:31:53.131309] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.F3PQN1OFSm 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:08.313 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:10:08.572 00:10:08.572 real 0m4.952s 00:10:08.572 user 0m7.585s 00:10:08.572 sys 0m0.725s 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:08.572 23:31:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:08.572 ************************************ 00:10:08.572 END TEST raid_write_error_test 00:10:08.572 ************************************ 00:10:08.572 23:31:53 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:08.572 23:31:53 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:10:08.572 23:31:53 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:08.572 23:31:53 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:08.572 23:31:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:08.572 ************************************ 00:10:08.572 START TEST raid_state_function_test 00:10:08.572 ************************************ 00:10:08.572 
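The epilogue of `raid_write_error_test` above computes `fail_per_s` from the bdevperf log with `grep -v Job | grep raid_bdev1 | awk '{print $6}'`, then asserts it differs from `0.00`: concat carries no redundancy, so injected write failures must surface in the stats. A Python sketch of that pipeline (the sample output line is invented for illustration; only the field position matters):

```python
# Invented bdevperf-style summary; field 6 of the data line holds failures/s.
bdevperf_out = """\
Job: job_raid_bdev1 (Core Mask 0x1)
raid_bdev1 4096 randwrite 12000.00 46.88 0.53
"""

def fail_per_s(text, name):
    for line in text.splitlines():
        if "Job" in line:          # grep -v Job
            continue
        if name not in line:       # grep raid_bdev1
            continue
        return line.split()[5]     # awk '{print $6}' (awk fields are 1-based)
    return None

rate = fail_per_s(bdevperf_out, "raid_bdev1")
# concat has no redundancy, so injected write errors must show up:
assert rate != "0.00"
print(rate)  # → 0.53
```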
23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 false 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:08.572 23:31:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=255920 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 255920' 00:10:08.572 Process raid pid: 255920 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 255920 /var/tmp/spdk-raid.sock 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 255920 ']' 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:08.572 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:08.572 23:31:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:08.572 [2024-07-24 23:31:53.442555] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:10:08.572 [2024-07-24 23:31:53.442594] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:08.572 [2024-07-24 23:31:53.504988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.830 [2024-07-24 23:31:53.584325] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.830 [2024-07-24 23:31:53.642769] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:08.830 [2024-07-24 23:31:53.642798] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:09.394 23:31:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:09.394 23:31:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:10:09.394 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:09.394 [2024-07-24 23:31:54.386085] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:09.394 [2024-07-24 23:31:54.386119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:09.394 [2024-07-24 23:31:54.386125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:09.394 [2024-07-24 23:31:54.386130] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:09.652 "name": "Existed_Raid", 00:10:09.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.652 "strip_size_kb": 0, 00:10:09.652 "state": "configuring", 00:10:09.652 "raid_level": "raid1", 00:10:09.652 "superblock": false, 00:10:09.652 "num_base_bdevs": 2, 00:10:09.652 "num_base_bdevs_discovered": 0, 00:10:09.652 "num_base_bdevs_operational": 2, 
00:10:09.652 "base_bdevs_list": [ 00:10:09.652 { 00:10:09.652 "name": "BaseBdev1", 00:10:09.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.652 "is_configured": false, 00:10:09.652 "data_offset": 0, 00:10:09.652 "data_size": 0 00:10:09.652 }, 00:10:09.652 { 00:10:09.652 "name": "BaseBdev2", 00:10:09.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:09.652 "is_configured": false, 00:10:09.652 "data_offset": 0, 00:10:09.652 "data_size": 0 00:10:09.652 } 00:10:09.652 ] 00:10:09.652 }' 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:09.652 23:31:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:10.217 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:10.217 [2024-07-24 23:31:55.212133] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:10.217 [2024-07-24 23:31:55.212158] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2ab10 name Existed_Raid, state configuring 00:10:10.473 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:10.473 [2024-07-24 23:31:55.384599] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:10.473 [2024-07-24 23:31:55.384618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:10.473 [2024-07-24 23:31:55.384623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:10.473 [2024-07-24 23:31:55.384628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:10.473 23:31:55 
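At this point `Existed_Raid` exists before either base bdev does, so `bdev_raid_get_bdevs` reports it `configuring`: nothing discovered yet, and each base slot is a placeholder with an all-zero UUID and `is_configured: false`. A small sketch that encodes the same invariant (JSON values copied from the reply above, trimmed to the relevant fields):

```python
import json

existed_raid = json.loads("""{
  "name": "Existed_Raid",
  "state": "configuring",
  "raid_level": "raid1",
  "num_base_bdevs": 2,
  "num_base_bdevs_discovered": 0,
  "num_base_bdevs_operational": 2,
  "base_bdevs_list": [
    {"name": "BaseBdev1", "uuid": "00000000-0000-0000-0000-000000000000",
     "is_configured": false, "data_offset": 0, "data_size": 0},
    {"name": "BaseBdev2", "uuid": "00000000-0000-0000-0000-000000000000",
     "is_configured": false, "data_offset": 0, "data_size": 0}
  ]
}""")

# A raid bdev with no discovered base bdevs must still be configuring,
# and every base slot must be an unconfigured placeholder.
assert existed_raid["state"] == "configuring"
assert existed_raid["num_base_bdevs_discovered"] == 0
assert all(not b["is_configured"] for b in existed_raid["base_bdevs_list"])
```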
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:10.730 [2024-07-24 23:31:55.561151] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:10.730 BaseBdev1 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:10.730 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:10.988 [ 00:10:10.988 { 00:10:10.988 "name": "BaseBdev1", 00:10:10.988 "aliases": [ 00:10:10.988 "2e9f853b-60e2-47a4-a67e-94f181c2fbce" 00:10:10.988 ], 00:10:10.988 "product_name": "Malloc disk", 00:10:10.988 "block_size": 512, 00:10:10.988 "num_blocks": 65536, 00:10:10.988 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:10.988 "assigned_rate_limits": { 00:10:10.988 "rw_ios_per_sec": 0, 00:10:10.988 "rw_mbytes_per_sec": 0, 00:10:10.988 "r_mbytes_per_sec": 0, 00:10:10.988 "w_mbytes_per_sec": 0 00:10:10.988 }, 00:10:10.988 "claimed": true, 
00:10:10.988 "claim_type": "exclusive_write", 00:10:10.988 "zoned": false, 00:10:10.988 "supported_io_types": { 00:10:10.988 "read": true, 00:10:10.988 "write": true, 00:10:10.988 "unmap": true, 00:10:10.988 "flush": true, 00:10:10.988 "reset": true, 00:10:10.988 "nvme_admin": false, 00:10:10.988 "nvme_io": false, 00:10:10.988 "nvme_io_md": false, 00:10:10.988 "write_zeroes": true, 00:10:10.988 "zcopy": true, 00:10:10.988 "get_zone_info": false, 00:10:10.988 "zone_management": false, 00:10:10.988 "zone_append": false, 00:10:10.988 "compare": false, 00:10:10.988 "compare_and_write": false, 00:10:10.988 "abort": true, 00:10:10.988 "seek_hole": false, 00:10:10.988 "seek_data": false, 00:10:10.988 "copy": true, 00:10:10.988 "nvme_iov_md": false 00:10:10.988 }, 00:10:10.988 "memory_domains": [ 00:10:10.988 { 00:10:10.988 "dma_device_id": "system", 00:10:10.988 "dma_device_type": 1 00:10:10.988 }, 00:10:10.988 { 00:10:10.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:10.988 "dma_device_type": 2 00:10:10.988 } 00:10:10.988 ], 00:10:10.988 "driver_specific": {} 00:10:10.988 } 00:10:10.988 ] 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:10.988 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:10.989 23:31:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:11.247 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:11.247 "name": "Existed_Raid", 00:10:11.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:11.247 "strip_size_kb": 0, 00:10:11.247 "state": "configuring", 00:10:11.247 "raid_level": "raid1", 00:10:11.247 "superblock": false, 00:10:11.247 "num_base_bdevs": 2, 00:10:11.247 "num_base_bdevs_discovered": 1, 00:10:11.247 "num_base_bdevs_operational": 2, 00:10:11.247 "base_bdevs_list": [ 00:10:11.247 { 00:10:11.247 "name": "BaseBdev1", 00:10:11.247 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:11.247 "is_configured": true, 00:10:11.247 "data_offset": 0, 00:10:11.247 "data_size": 65536 00:10:11.247 }, 00:10:11.247 { 00:10:11.247 "name": "BaseBdev2", 00:10:11.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:11.247 "is_configured": false, 00:10:11.247 "data_offset": 0, 00:10:11.247 "data_size": 0 00:10:11.247 } 00:10:11.247 ] 00:10:11.247 }' 00:10:11.247 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:11.247 23:31:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:11.813 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:11.813 [2024-07-24 23:31:56.716144] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:11.813 [2024-07-24 23:31:56.716175] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2a3a0 name Existed_Raid, state configuring 00:10:11.813 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:12.071 [2024-07-24 23:31:56.880593] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:12.071 [2024-07-24 23:31:56.881627] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:12.071 [2024-07-24 23:31:56.881653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:12.071 23:31:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:12.071 23:31:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:12.329 23:31:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:12.329 "name": "Existed_Raid", 00:10:12.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:12.329 "strip_size_kb": 0, 00:10:12.329 "state": "configuring", 00:10:12.329 "raid_level": "raid1", 00:10:12.329 "superblock": false, 00:10:12.329 "num_base_bdevs": 2, 00:10:12.329 "num_base_bdevs_discovered": 1, 00:10:12.329 "num_base_bdevs_operational": 2, 00:10:12.329 "base_bdevs_list": [ 00:10:12.329 { 00:10:12.329 "name": "BaseBdev1", 00:10:12.329 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:12.329 "is_configured": true, 00:10:12.329 "data_offset": 0, 00:10:12.329 "data_size": 65536 00:10:12.329 }, 00:10:12.329 { 00:10:12.329 "name": "BaseBdev2", 00:10:12.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:12.329 "is_configured": false, 00:10:12.329 "data_offset": 0, 00:10:12.329 "data_size": 0 00:10:12.329 } 00:10:12.329 ] 00:10:12.329 }' 00:10:12.329 23:31:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:12.329 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:12.586 23:31:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:12.844 [2024-07-24 23:31:57.713458] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:12.844 [2024-07-24 23:31:57.713502] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xb2b050 00:10:12.844 [2024-07-24 23:31:57.713509] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:10:12.844 [2024-07-24 23:31:57.713679] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2d770 00:10:12.844 [2024-07-24 23:31:57.713772] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb2b050 00:10:12.844 [2024-07-24 23:31:57.713777] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb2b050 00:10:12.844 [2024-07-24 23:31:57.713917] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:12.844 BaseBdev2 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:12.844 23:31:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:13.102 23:31:57 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:13.102 [ 00:10:13.102 { 00:10:13.102 "name": "BaseBdev2", 00:10:13.102 "aliases": [ 00:10:13.102 "59632d3e-ed43-40bf-9b34-b7bc928e755f" 00:10:13.102 ], 00:10:13.102 "product_name": "Malloc disk", 00:10:13.102 "block_size": 512, 00:10:13.102 "num_blocks": 65536, 00:10:13.102 "uuid": "59632d3e-ed43-40bf-9b34-b7bc928e755f", 00:10:13.102 "assigned_rate_limits": { 00:10:13.102 "rw_ios_per_sec": 0, 00:10:13.102 "rw_mbytes_per_sec": 0, 00:10:13.102 "r_mbytes_per_sec": 0, 00:10:13.102 "w_mbytes_per_sec": 0 00:10:13.102 }, 00:10:13.102 "claimed": true, 00:10:13.102 "claim_type": "exclusive_write", 00:10:13.102 "zoned": false, 00:10:13.102 "supported_io_types": { 00:10:13.102 "read": true, 00:10:13.102 "write": true, 00:10:13.102 "unmap": true, 00:10:13.102 "flush": true, 00:10:13.102 "reset": true, 00:10:13.102 "nvme_admin": false, 00:10:13.102 "nvme_io": false, 00:10:13.102 "nvme_io_md": false, 00:10:13.102 "write_zeroes": true, 00:10:13.102 "zcopy": true, 00:10:13.102 "get_zone_info": false, 00:10:13.102 "zone_management": false, 00:10:13.102 "zone_append": false, 00:10:13.102 "compare": false, 00:10:13.102 "compare_and_write": false, 00:10:13.102 "abort": true, 00:10:13.102 "seek_hole": false, 00:10:13.102 "seek_data": false, 00:10:13.102 "copy": true, 00:10:13.102 "nvme_iov_md": false 00:10:13.102 }, 00:10:13.102 "memory_domains": [ 00:10:13.102 { 00:10:13.102 "dma_device_id": "system", 00:10:13.102 "dma_device_type": 1 00:10:13.102 }, 00:10:13.102 { 00:10:13.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.102 "dma_device_type": 2 00:10:13.102 } 00:10:13.102 ], 00:10:13.102 "driver_specific": {} 00:10:13.102 } 00:10:13.102 ] 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- 
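`waitforbdev BaseBdev2` above wraps `bdev_get_bdevs -b BaseBdev2 -t 2000`, where the timeout (in ms) lets the query wait for the bdev to appear. The waiting idea can be sketched as a generic poll-with-deadline loop; the `lookup` callable below stands in for the RPC and is an assumption of this sketch, not SPDK's implementation:

```python
import time

def wait_for_bdev(lookup, name, timeout_ms=2000, poll_s=0.01):
    """Poll `lookup(name)` until it returns a bdev dict or the deadline passes."""
    deadline = time.monotonic() + timeout_ms / 1000.0
    while time.monotonic() < deadline:
        bdev = lookup(name)
        if bdev is not None:
            return bdev
        time.sleep(poll_s)
    raise TimeoutError(f"bdev {name} did not appear within {timeout_ms} ms")

# Stand-in for the RPC: BaseBdev2 "appears" on the third call.
calls = {"n": 0}
def fake_lookup(name):
    calls["n"] += 1
    return {"name": name, "block_size": 512} if calls["n"] >= 3 else None

bdev = wait_for_bdev(fake_lookup, "BaseBdev2")
assert bdev["name"] == "BaseBdev2"
```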
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:13.102 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:13.361 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:13.361 "name": "Existed_Raid", 00:10:13.361 "uuid": "1932369a-c558-47c1-8aea-6acaae6e4bf6", 00:10:13.361 "strip_size_kb": 0, 00:10:13.361 "state": "online", 00:10:13.361 "raid_level": "raid1", 00:10:13.361 "superblock": false, 00:10:13.361 "num_base_bdevs": 
2, 00:10:13.361 "num_base_bdevs_discovered": 2, 00:10:13.361 "num_base_bdevs_operational": 2, 00:10:13.361 "base_bdevs_list": [ 00:10:13.361 { 00:10:13.361 "name": "BaseBdev1", 00:10:13.361 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:13.361 "is_configured": true, 00:10:13.361 "data_offset": 0, 00:10:13.361 "data_size": 65536 00:10:13.361 }, 00:10:13.361 { 00:10:13.361 "name": "BaseBdev2", 00:10:13.361 "uuid": "59632d3e-ed43-40bf-9b34-b7bc928e755f", 00:10:13.361 "is_configured": true, 00:10:13.361 "data_offset": 0, 00:10:13.361 "data_size": 65536 00:10:13.361 } 00:10:13.361 ] 00:10:13.361 }' 00:10:13.361 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:13.361 23:31:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:13.926 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:13.926 [2024-07-24 23:31:58.868654] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:13.927 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:10:13.927 "name": "Existed_Raid", 00:10:13.927 "aliases": [ 00:10:13.927 "1932369a-c558-47c1-8aea-6acaae6e4bf6" 00:10:13.927 ], 00:10:13.927 "product_name": "Raid Volume", 00:10:13.927 "block_size": 512, 00:10:13.927 "num_blocks": 65536, 00:10:13.927 "uuid": "1932369a-c558-47c1-8aea-6acaae6e4bf6", 00:10:13.927 "assigned_rate_limits": { 00:10:13.927 "rw_ios_per_sec": 0, 00:10:13.927 "rw_mbytes_per_sec": 0, 00:10:13.927 "r_mbytes_per_sec": 0, 00:10:13.927 "w_mbytes_per_sec": 0 00:10:13.927 }, 00:10:13.927 "claimed": false, 00:10:13.927 "zoned": false, 00:10:13.927 "supported_io_types": { 00:10:13.927 "read": true, 00:10:13.927 "write": true, 00:10:13.927 "unmap": false, 00:10:13.927 "flush": false, 00:10:13.927 "reset": true, 00:10:13.927 "nvme_admin": false, 00:10:13.927 "nvme_io": false, 00:10:13.927 "nvme_io_md": false, 00:10:13.927 "write_zeroes": true, 00:10:13.927 "zcopy": false, 00:10:13.927 "get_zone_info": false, 00:10:13.927 "zone_management": false, 00:10:13.927 "zone_append": false, 00:10:13.927 "compare": false, 00:10:13.927 "compare_and_write": false, 00:10:13.927 "abort": false, 00:10:13.927 "seek_hole": false, 00:10:13.927 "seek_data": false, 00:10:13.927 "copy": false, 00:10:13.927 "nvme_iov_md": false 00:10:13.927 }, 00:10:13.927 "memory_domains": [ 00:10:13.927 { 00:10:13.927 "dma_device_id": "system", 00:10:13.927 "dma_device_type": 1 00:10:13.927 }, 00:10:13.927 { 00:10:13.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.927 "dma_device_type": 2 00:10:13.927 }, 00:10:13.927 { 00:10:13.927 "dma_device_id": "system", 00:10:13.927 "dma_device_type": 1 00:10:13.927 }, 00:10:13.927 { 00:10:13.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.927 "dma_device_type": 2 00:10:13.927 } 00:10:13.927 ], 00:10:13.927 "driver_specific": { 00:10:13.927 "raid": { 00:10:13.927 "uuid": "1932369a-c558-47c1-8aea-6acaae6e4bf6", 00:10:13.927 "strip_size_kb": 0, 00:10:13.927 "state": "online", 00:10:13.927 "raid_level": "raid1", 
00:10:13.927 "superblock": false, 00:10:13.927 "num_base_bdevs": 2, 00:10:13.927 "num_base_bdevs_discovered": 2, 00:10:13.927 "num_base_bdevs_operational": 2, 00:10:13.927 "base_bdevs_list": [ 00:10:13.927 { 00:10:13.927 "name": "BaseBdev1", 00:10:13.927 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:13.927 "is_configured": true, 00:10:13.927 "data_offset": 0, 00:10:13.927 "data_size": 65536 00:10:13.927 }, 00:10:13.927 { 00:10:13.927 "name": "BaseBdev2", 00:10:13.927 "uuid": "59632d3e-ed43-40bf-9b34-b7bc928e755f", 00:10:13.927 "is_configured": true, 00:10:13.927 "data_offset": 0, 00:10:13.927 "data_size": 65536 00:10:13.927 } 00:10:13.927 ] 00:10:13.927 } 00:10:13.927 } 00:10:13.927 }' 00:10:13.927 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:14.185 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:14.185 BaseBdev2' 00:10:14.185 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:14.185 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:14.185 23:31:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:14.185 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:14.185 "name": "BaseBdev1", 00:10:14.185 "aliases": [ 00:10:14.185 "2e9f853b-60e2-47a4-a67e-94f181c2fbce" 00:10:14.185 ], 00:10:14.185 "product_name": "Malloc disk", 00:10:14.185 "block_size": 512, 00:10:14.185 "num_blocks": 65536, 00:10:14.185 "uuid": "2e9f853b-60e2-47a4-a67e-94f181c2fbce", 00:10:14.185 "assigned_rate_limits": { 00:10:14.185 "rw_ios_per_sec": 0, 00:10:14.185 "rw_mbytes_per_sec": 0, 00:10:14.185 "r_mbytes_per_sec": 0, 00:10:14.185 
"w_mbytes_per_sec": 0 00:10:14.185 }, 00:10:14.185 "claimed": true, 00:10:14.185 "claim_type": "exclusive_write", 00:10:14.185 "zoned": false, 00:10:14.185 "supported_io_types": { 00:10:14.185 "read": true, 00:10:14.185 "write": true, 00:10:14.185 "unmap": true, 00:10:14.185 "flush": true, 00:10:14.185 "reset": true, 00:10:14.185 "nvme_admin": false, 00:10:14.185 "nvme_io": false, 00:10:14.185 "nvme_io_md": false, 00:10:14.185 "write_zeroes": true, 00:10:14.185 "zcopy": true, 00:10:14.185 "get_zone_info": false, 00:10:14.185 "zone_management": false, 00:10:14.185 "zone_append": false, 00:10:14.185 "compare": false, 00:10:14.185 "compare_and_write": false, 00:10:14.185 "abort": true, 00:10:14.185 "seek_hole": false, 00:10:14.185 "seek_data": false, 00:10:14.185 "copy": true, 00:10:14.185 "nvme_iov_md": false 00:10:14.185 }, 00:10:14.185 "memory_domains": [ 00:10:14.185 { 00:10:14.185 "dma_device_id": "system", 00:10:14.185 "dma_device_type": 1 00:10:14.185 }, 00:10:14.185 { 00:10:14.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.185 "dma_device_type": 2 00:10:14.185 } 00:10:14.185 ], 00:10:14.185 "driver_specific": {} 00:10:14.185 }' 00:10:14.185 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:14.185 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:14.443 
23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:14.443 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:14.700 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:14.700 "name": "BaseBdev2", 00:10:14.700 "aliases": [ 00:10:14.700 "59632d3e-ed43-40bf-9b34-b7bc928e755f" 00:10:14.700 ], 00:10:14.700 "product_name": "Malloc disk", 00:10:14.700 "block_size": 512, 00:10:14.700 "num_blocks": 65536, 00:10:14.700 "uuid": "59632d3e-ed43-40bf-9b34-b7bc928e755f", 00:10:14.700 "assigned_rate_limits": { 00:10:14.700 "rw_ios_per_sec": 0, 00:10:14.700 "rw_mbytes_per_sec": 0, 00:10:14.700 "r_mbytes_per_sec": 0, 00:10:14.700 "w_mbytes_per_sec": 0 00:10:14.700 }, 00:10:14.700 "claimed": true, 00:10:14.700 "claim_type": "exclusive_write", 00:10:14.700 "zoned": false, 00:10:14.700 "supported_io_types": { 00:10:14.700 "read": true, 00:10:14.700 "write": true, 00:10:14.700 "unmap": true, 00:10:14.700 "flush": true, 00:10:14.700 "reset": true, 00:10:14.700 "nvme_admin": false, 00:10:14.700 "nvme_io": false, 00:10:14.700 "nvme_io_md": false, 00:10:14.700 "write_zeroes": true, 00:10:14.700 "zcopy": true, 00:10:14.700 "get_zone_info": false, 00:10:14.700 "zone_management": false, 00:10:14.700 "zone_append": false, 00:10:14.700 "compare": 
false, 00:10:14.700 "compare_and_write": false, 00:10:14.700 "abort": true, 00:10:14.700 "seek_hole": false, 00:10:14.700 "seek_data": false, 00:10:14.700 "copy": true, 00:10:14.700 "nvme_iov_md": false 00:10:14.700 }, 00:10:14.700 "memory_domains": [ 00:10:14.700 { 00:10:14.700 "dma_device_id": "system", 00:10:14.700 "dma_device_type": 1 00:10:14.700 }, 00:10:14.700 { 00:10:14.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:14.700 "dma_device_type": 2 00:10:14.700 } 00:10:14.700 ], 00:10:14.700 "driver_specific": {} 00:10:14.700 }' 00:10:14.700 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:14.700 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:14.700 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:14.700 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:14.957 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:14.957 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:14.958 23:31:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:15.215 
[2024-07-24 23:32:00.067586] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:15.215 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:15.215 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:15.215 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:15.216 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:15.473 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:15.473 "name": "Existed_Raid", 00:10:15.473 "uuid": "1932369a-c558-47c1-8aea-6acaae6e4bf6", 00:10:15.473 "strip_size_kb": 0, 00:10:15.473 "state": "online", 00:10:15.473 "raid_level": "raid1", 00:10:15.473 "superblock": false, 00:10:15.473 "num_base_bdevs": 2, 00:10:15.473 "num_base_bdevs_discovered": 1, 00:10:15.473 "num_base_bdevs_operational": 1, 00:10:15.473 "base_bdevs_list": [ 00:10:15.473 { 00:10:15.473 "name": null, 00:10:15.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:15.473 "is_configured": false, 00:10:15.473 "data_offset": 0, 00:10:15.473 "data_size": 65536 00:10:15.473 }, 00:10:15.473 { 00:10:15.473 "name": "BaseBdev2", 00:10:15.473 "uuid": "59632d3e-ed43-40bf-9b34-b7bc928e755f", 00:10:15.473 "is_configured": true, 00:10:15.473 "data_offset": 0, 00:10:15.473 "data_size": 65536 00:10:15.473 } 00:10:15.473 ] 00:10:15.473 }' 00:10:15.473 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:15.473 23:32:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 
-- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:16.038 23:32:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:16.295 [2024-07-24 23:32:01.070965] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:16.296 [2024-07-24 23:32:01.071031] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:16.296 [2024-07-24 23:32:01.080985] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:16.296 [2024-07-24 23:32:01.081027] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:16.296 [2024-07-24 23:32:01.081033] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb2b050 name Existed_Raid, state offline 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 255920 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 255920 ']' 00:10:16.296 23:32:01 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 255920 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:16.296 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 255920 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 255920' 00:10:16.554 killing process with pid 255920 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 255920 00:10:16.554 [2024-07-24 23:32:01.308569] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 255920 00:10:16.554 [2024-07-24 23:32:01.309331] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:16.554 00:10:16.554 real 0m8.092s 00:10:16.554 user 0m14.475s 00:10:16.554 sys 0m1.311s 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:16.554 ************************************ 00:10:16.554 END TEST raid_state_function_test 00:10:16.554 ************************************ 00:10:16.554 23:32:01 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:10:16.554 23:32:01 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:16.554 23:32:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:16.554 23:32:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:16.554 ************************************ 00:10:16.554 START TEST raid_state_function_test_sb 00:10:16.554 ************************************ 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:16.554 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=257519 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 257519' 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:16.813 Process raid pid: 257519 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 257519 /var/tmp/spdk-raid.sock 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 257519 ']' 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:16.813 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:16.813 23:32:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:16.813 [2024-07-24 23:32:01.602244] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:10:16.813 [2024-07-24 23:32:01.602282] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:16.813 [2024-07-24 23:32:01.665403] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:16.813 [2024-07-24 23:32:01.744357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:16.813 [2024-07-24 23:32:01.799888] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:16.813 [2024-07-24 23:32:01.799915] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:17.746 [2024-07-24 23:32:02.547368] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:17.746 [2024-07-24 23:32:02.547399] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:10:17.746 [2024-07-24 23:32:02.547404] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:17.746 [2024-07-24 23:32:02.547409] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.746 "name": "Existed_Raid", 00:10:17.746 "uuid": 
"e16e447d-88f3-44a2-909f-286b8aec2081", 00:10:17.746 "strip_size_kb": 0, 00:10:17.746 "state": "configuring", 00:10:17.746 "raid_level": "raid1", 00:10:17.746 "superblock": true, 00:10:17.746 "num_base_bdevs": 2, 00:10:17.746 "num_base_bdevs_discovered": 0, 00:10:17.746 "num_base_bdevs_operational": 2, 00:10:17.746 "base_bdevs_list": [ 00:10:17.746 { 00:10:17.746 "name": "BaseBdev1", 00:10:17.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.746 "is_configured": false, 00:10:17.746 "data_offset": 0, 00:10:17.746 "data_size": 0 00:10:17.746 }, 00:10:17.746 { 00:10:17.746 "name": "BaseBdev2", 00:10:17.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.746 "is_configured": false, 00:10:17.746 "data_offset": 0, 00:10:17.746 "data_size": 0 00:10:17.746 } 00:10:17.746 ] 00:10:17.746 }' 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.746 23:32:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:18.310 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:18.568 [2024-07-24 23:32:03.329306] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:18.568 [2024-07-24 23:32:03.329329] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2bb10 name Existed_Raid, state configuring 00:10:18.568 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:18.568 [2024-07-24 23:32:03.489744] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:18.568 [2024-07-24 23:32:03.489769] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:10:18.568 [2024-07-24 23:32:03.489774] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:18.568 [2024-07-24 23:32:03.489779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:18.568 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:18.824 [2024-07-24 23:32:03.646290] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:18.824 BaseBdev1 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:18.824 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:19.081 [ 00:10:19.081 { 00:10:19.081 "name": "BaseBdev1", 00:10:19.081 "aliases": [ 00:10:19.081 "b53e7302-d45e-4215-8eeb-c27892f247a9" 00:10:19.081 ], 00:10:19.081 "product_name": "Malloc disk", 00:10:19.081 "block_size": 
512, 00:10:19.081 "num_blocks": 65536, 00:10:19.081 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:19.081 "assigned_rate_limits": { 00:10:19.081 "rw_ios_per_sec": 0, 00:10:19.081 "rw_mbytes_per_sec": 0, 00:10:19.081 "r_mbytes_per_sec": 0, 00:10:19.081 "w_mbytes_per_sec": 0 00:10:19.081 }, 00:10:19.081 "claimed": true, 00:10:19.081 "claim_type": "exclusive_write", 00:10:19.081 "zoned": false, 00:10:19.081 "supported_io_types": { 00:10:19.081 "read": true, 00:10:19.081 "write": true, 00:10:19.081 "unmap": true, 00:10:19.081 "flush": true, 00:10:19.081 "reset": true, 00:10:19.081 "nvme_admin": false, 00:10:19.081 "nvme_io": false, 00:10:19.081 "nvme_io_md": false, 00:10:19.081 "write_zeroes": true, 00:10:19.081 "zcopy": true, 00:10:19.081 "get_zone_info": false, 00:10:19.081 "zone_management": false, 00:10:19.081 "zone_append": false, 00:10:19.081 "compare": false, 00:10:19.081 "compare_and_write": false, 00:10:19.081 "abort": true, 00:10:19.081 "seek_hole": false, 00:10:19.081 "seek_data": false, 00:10:19.081 "copy": true, 00:10:19.081 "nvme_iov_md": false 00:10:19.081 }, 00:10:19.081 "memory_domains": [ 00:10:19.081 { 00:10:19.081 "dma_device_id": "system", 00:10:19.081 "dma_device_type": 1 00:10:19.081 }, 00:10:19.081 { 00:10:19.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:19.081 "dma_device_type": 2 00:10:19.081 } 00:10:19.081 ], 00:10:19.081 "driver_specific": {} 00:10:19.081 } 00:10:19.081 ] 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:19.081 23:32:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:19.339 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:19.339 "name": "Existed_Raid", 00:10:19.339 "uuid": "7b040d01-b92a-4224-9a38-b44283bc1fd9", 00:10:19.339 "strip_size_kb": 0, 00:10:19.339 "state": "configuring", 00:10:19.339 "raid_level": "raid1", 00:10:19.339 "superblock": true, 00:10:19.339 "num_base_bdevs": 2, 00:10:19.339 "num_base_bdevs_discovered": 1, 00:10:19.339 "num_base_bdevs_operational": 2, 00:10:19.339 "base_bdevs_list": [ 00:10:19.339 { 00:10:19.339 "name": "BaseBdev1", 00:10:19.339 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:19.339 "is_configured": true, 00:10:19.339 "data_offset": 2048, 00:10:19.339 "data_size": 63488 00:10:19.339 }, 00:10:19.339 { 00:10:19.339 "name": "BaseBdev2", 00:10:19.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:19.339 "is_configured": false, 00:10:19.339 "data_offset": 0, 00:10:19.339 
"data_size": 0 00:10:19.339 } 00:10:19.339 ] 00:10:19.339 }' 00:10:19.339 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:19.339 23:32:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:19.903 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:19.903 [2024-07-24 23:32:04.773197] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:19.903 [2024-07-24 23:32:04.773228] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2b3a0 name Existed_Raid, state configuring 00:10:19.903 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:20.161 [2024-07-24 23:32:04.941669] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:20.161 [2024-07-24 23:32:04.942668] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:20.161 [2024-07-24 23:32:04.942693] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:20.161 23:32:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:20.161 23:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.161 "name": "Existed_Raid", 00:10:20.161 "uuid": "86507d78-c8b6-48fa-9deb-db31796bf21a", 00:10:20.161 "strip_size_kb": 0, 00:10:20.161 "state": "configuring", 00:10:20.161 "raid_level": "raid1", 00:10:20.161 "superblock": true, 00:10:20.161 "num_base_bdevs": 2, 00:10:20.161 "num_base_bdevs_discovered": 1, 00:10:20.161 "num_base_bdevs_operational": 2, 00:10:20.162 "base_bdevs_list": [ 00:10:20.162 { 00:10:20.162 "name": "BaseBdev1", 00:10:20.162 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:20.162 "is_configured": true, 00:10:20.162 "data_offset": 2048, 00:10:20.162 "data_size": 63488 00:10:20.162 }, 00:10:20.162 { 00:10:20.162 "name": "BaseBdev2", 00:10:20.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.162 
"is_configured": false, 00:10:20.162 "data_offset": 0, 00:10:20.162 "data_size": 0 00:10:20.162 } 00:10:20.162 ] 00:10:20.162 }' 00:10:20.162 23:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.162 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:20.727 23:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:20.985 [2024-07-24 23:32:05.778340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:20.985 [2024-07-24 23:32:05.778446] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf2c050 00:10:20.985 [2024-07-24 23:32:05.778454] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:20.985 [2024-07-24 23:32:05.778596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf2feb0 00:10:20.985 [2024-07-24 23:32:05.778686] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf2c050 00:10:20.985 [2024-07-24 23:32:05.778692] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf2c050 00:10:20.985 [2024-07-24 23:32:05.778753] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.985 BaseBdev2 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:20.985 23:32:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:21.243 [ 00:10:21.243 { 00:10:21.243 "name": "BaseBdev2", 00:10:21.243 "aliases": [ 00:10:21.243 "91f8b951-117e-4ec3-b89e-4bbb1ea04542" 00:10:21.243 ], 00:10:21.243 "product_name": "Malloc disk", 00:10:21.243 "block_size": 512, 00:10:21.243 "num_blocks": 65536, 00:10:21.243 "uuid": "91f8b951-117e-4ec3-b89e-4bbb1ea04542", 00:10:21.243 "assigned_rate_limits": { 00:10:21.243 "rw_ios_per_sec": 0, 00:10:21.243 "rw_mbytes_per_sec": 0, 00:10:21.243 "r_mbytes_per_sec": 0, 00:10:21.243 "w_mbytes_per_sec": 0 00:10:21.243 }, 00:10:21.243 "claimed": true, 00:10:21.243 "claim_type": "exclusive_write", 00:10:21.243 "zoned": false, 00:10:21.243 "supported_io_types": { 00:10:21.243 "read": true, 00:10:21.243 "write": true, 00:10:21.243 "unmap": true, 00:10:21.243 "flush": true, 00:10:21.243 "reset": true, 00:10:21.243 "nvme_admin": false, 00:10:21.243 "nvme_io": false, 00:10:21.243 "nvme_io_md": false, 00:10:21.243 "write_zeroes": true, 00:10:21.243 "zcopy": true, 00:10:21.243 "get_zone_info": false, 00:10:21.243 "zone_management": false, 00:10:21.243 "zone_append": false, 00:10:21.243 "compare": false, 00:10:21.243 "compare_and_write": false, 00:10:21.243 "abort": true, 00:10:21.243 "seek_hole": false, 00:10:21.243 "seek_data": false, 00:10:21.243 "copy": true, 00:10:21.243 "nvme_iov_md": false 00:10:21.243 }, 00:10:21.243 "memory_domains": [ 00:10:21.243 { 00:10:21.243 "dma_device_id": 
"system", 00:10:21.243 "dma_device_type": 1 00:10:21.244 }, 00:10:21.244 { 00:10:21.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.244 "dma_device_type": 2 00:10:21.244 } 00:10:21.244 ], 00:10:21.244 "driver_specific": {} 00:10:21.244 } 00:10:21.244 ] 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:21.244 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.244 23:32:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:21.502 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:21.502 "name": "Existed_Raid", 00:10:21.502 "uuid": "86507d78-c8b6-48fa-9deb-db31796bf21a", 00:10:21.502 "strip_size_kb": 0, 00:10:21.502 "state": "online", 00:10:21.502 "raid_level": "raid1", 00:10:21.502 "superblock": true, 00:10:21.502 "num_base_bdevs": 2, 00:10:21.502 "num_base_bdevs_discovered": 2, 00:10:21.502 "num_base_bdevs_operational": 2, 00:10:21.502 "base_bdevs_list": [ 00:10:21.502 { 00:10:21.502 "name": "BaseBdev1", 00:10:21.502 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:21.502 "is_configured": true, 00:10:21.502 "data_offset": 2048, 00:10:21.502 "data_size": 63488 00:10:21.502 }, 00:10:21.502 { 00:10:21.502 "name": "BaseBdev2", 00:10:21.502 "uuid": "91f8b951-117e-4ec3-b89e-4bbb1ea04542", 00:10:21.502 "is_configured": true, 00:10:21.502 "data_offset": 2048, 00:10:21.502 "data_size": 63488 00:10:21.502 } 00:10:21.502 ] 00:10:21.502 }' 00:10:21.502 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:21.502 23:32:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 
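The `verify_raid_bdev_properties` section that follows probes each base bdev with repeated jq field lookups (`.block_size`, `.md_size`, `.md_interleave`, `.dif_type`) and compares them with `[[ ... == ... ]]`. A hedged sketch of that per-field probing, again with a canned JSON object standing in for `rpc.py ... bdev_get_bdevs -b BaseBdev1 | jq '.[]'`:

```shell
# Canned stand-in for one base bdev's info; values mirror the malloc bdevs
# in the trace (512-byte blocks, no metadata fields present).
base_bdev_info='{"name":"BaseBdev1","product_name":"Malloc disk","block_size":512,"num_blocks":65536}'

# The same per-field probes the script runs; keys absent from the JSON
# come back from jq as the literal string "null", which is what the
# trace's `[[ null == null ]]` comparisons rely on.
block_size=$(echo "$base_bdev_info" | jq .block_size)
md_size=$(echo "$base_bdev_info" | jq .md_size)
md_interleave=$(echo "$base_bdev_info" | jq .md_interleave)
dif_type=$(echo "$base_bdev_info" | jq .dif_type)
echo "block_size=$block_size md_size=$md_size"
```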
00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:21.760 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:22.018 [2024-07-24 23:32:06.905417] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:22.018 "name": "Existed_Raid", 00:10:22.018 "aliases": [ 00:10:22.018 "86507d78-c8b6-48fa-9deb-db31796bf21a" 00:10:22.018 ], 00:10:22.018 "product_name": "Raid Volume", 00:10:22.018 "block_size": 512, 00:10:22.018 "num_blocks": 63488, 00:10:22.018 "uuid": "86507d78-c8b6-48fa-9deb-db31796bf21a", 00:10:22.018 "assigned_rate_limits": { 00:10:22.018 "rw_ios_per_sec": 0, 00:10:22.018 "rw_mbytes_per_sec": 0, 00:10:22.018 "r_mbytes_per_sec": 0, 00:10:22.018 "w_mbytes_per_sec": 0 00:10:22.018 }, 00:10:22.018 "claimed": false, 00:10:22.018 "zoned": false, 00:10:22.018 "supported_io_types": { 00:10:22.018 "read": true, 00:10:22.018 "write": true, 00:10:22.018 "unmap": false, 00:10:22.018 "flush": false, 00:10:22.018 "reset": true, 00:10:22.018 "nvme_admin": false, 00:10:22.018 "nvme_io": false, 00:10:22.018 "nvme_io_md": false, 00:10:22.018 "write_zeroes": true, 00:10:22.018 "zcopy": false, 00:10:22.018 "get_zone_info": false, 00:10:22.018 "zone_management": false, 00:10:22.018 "zone_append": false, 00:10:22.018 "compare": false, 00:10:22.018 "compare_and_write": false, 00:10:22.018 "abort": false, 00:10:22.018 "seek_hole": false, 00:10:22.018 "seek_data": false, 00:10:22.018 "copy": false, 00:10:22.018 "nvme_iov_md": false 00:10:22.018 }, 00:10:22.018 "memory_domains": [ 00:10:22.018 { 00:10:22.018 "dma_device_id": "system", 00:10:22.018 "dma_device_type": 1 00:10:22.018 }, 00:10:22.018 { 00:10:22.018 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:10:22.018 "dma_device_type": 2 00:10:22.018 }, 00:10:22.018 { 00:10:22.018 "dma_device_id": "system", 00:10:22.018 "dma_device_type": 1 00:10:22.018 }, 00:10:22.018 { 00:10:22.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.018 "dma_device_type": 2 00:10:22.018 } 00:10:22.018 ], 00:10:22.018 "driver_specific": { 00:10:22.018 "raid": { 00:10:22.018 "uuid": "86507d78-c8b6-48fa-9deb-db31796bf21a", 00:10:22.018 "strip_size_kb": 0, 00:10:22.018 "state": "online", 00:10:22.018 "raid_level": "raid1", 00:10:22.018 "superblock": true, 00:10:22.018 "num_base_bdevs": 2, 00:10:22.018 "num_base_bdevs_discovered": 2, 00:10:22.018 "num_base_bdevs_operational": 2, 00:10:22.018 "base_bdevs_list": [ 00:10:22.018 { 00:10:22.018 "name": "BaseBdev1", 00:10:22.018 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:22.018 "is_configured": true, 00:10:22.018 "data_offset": 2048, 00:10:22.018 "data_size": 63488 00:10:22.018 }, 00:10:22.018 { 00:10:22.018 "name": "BaseBdev2", 00:10:22.018 "uuid": "91f8b951-117e-4ec3-b89e-4bbb1ea04542", 00:10:22.018 "is_configured": true, 00:10:22.018 "data_offset": 2048, 00:10:22.018 "data_size": 63488 00:10:22.018 } 00:10:22.018 ] 00:10:22.018 } 00:10:22.018 } 00:10:22.018 }' 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:22.018 BaseBdev2' 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:22.018 23:32:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:22.316 
23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:22.316 "name": "BaseBdev1", 00:10:22.316 "aliases": [ 00:10:22.316 "b53e7302-d45e-4215-8eeb-c27892f247a9" 00:10:22.316 ], 00:10:22.316 "product_name": "Malloc disk", 00:10:22.316 "block_size": 512, 00:10:22.316 "num_blocks": 65536, 00:10:22.316 "uuid": "b53e7302-d45e-4215-8eeb-c27892f247a9", 00:10:22.316 "assigned_rate_limits": { 00:10:22.316 "rw_ios_per_sec": 0, 00:10:22.316 "rw_mbytes_per_sec": 0, 00:10:22.316 "r_mbytes_per_sec": 0, 00:10:22.316 "w_mbytes_per_sec": 0 00:10:22.316 }, 00:10:22.316 "claimed": true, 00:10:22.316 "claim_type": "exclusive_write", 00:10:22.316 "zoned": false, 00:10:22.316 "supported_io_types": { 00:10:22.316 "read": true, 00:10:22.316 "write": true, 00:10:22.316 "unmap": true, 00:10:22.316 "flush": true, 00:10:22.316 "reset": true, 00:10:22.316 "nvme_admin": false, 00:10:22.316 "nvme_io": false, 00:10:22.316 "nvme_io_md": false, 00:10:22.316 "write_zeroes": true, 00:10:22.316 "zcopy": true, 00:10:22.316 "get_zone_info": false, 00:10:22.316 "zone_management": false, 00:10:22.316 "zone_append": false, 00:10:22.316 "compare": false, 00:10:22.316 "compare_and_write": false, 00:10:22.316 "abort": true, 00:10:22.316 "seek_hole": false, 00:10:22.316 "seek_data": false, 00:10:22.316 "copy": true, 00:10:22.316 "nvme_iov_md": false 00:10:22.316 }, 00:10:22.316 "memory_domains": [ 00:10:22.316 { 00:10:22.316 "dma_device_id": "system", 00:10:22.317 "dma_device_type": 1 00:10:22.317 }, 00:10:22.317 { 00:10:22.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.317 "dma_device_type": 2 00:10:22.317 } 00:10:22.317 ], 00:10:22.317 "driver_specific": {} 00:10:22.317 }' 00:10:22.317 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:22.317 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:22.317 23:32:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:22.317 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.317 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:22.597 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:22.852 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:22.852 "name": "BaseBdev2", 00:10:22.852 "aliases": [ 00:10:22.852 "91f8b951-117e-4ec3-b89e-4bbb1ea04542" 00:10:22.853 ], 00:10:22.853 "product_name": "Malloc disk", 00:10:22.853 "block_size": 512, 00:10:22.853 "num_blocks": 65536, 00:10:22.853 "uuid": "91f8b951-117e-4ec3-b89e-4bbb1ea04542", 00:10:22.853 "assigned_rate_limits": { 00:10:22.853 "rw_ios_per_sec": 0, 00:10:22.853 "rw_mbytes_per_sec": 0, 00:10:22.853 "r_mbytes_per_sec": 0, 00:10:22.853 
"w_mbytes_per_sec": 0 00:10:22.853 }, 00:10:22.853 "claimed": true, 00:10:22.853 "claim_type": "exclusive_write", 00:10:22.853 "zoned": false, 00:10:22.853 "supported_io_types": { 00:10:22.853 "read": true, 00:10:22.853 "write": true, 00:10:22.853 "unmap": true, 00:10:22.853 "flush": true, 00:10:22.853 "reset": true, 00:10:22.853 "nvme_admin": false, 00:10:22.853 "nvme_io": false, 00:10:22.853 "nvme_io_md": false, 00:10:22.853 "write_zeroes": true, 00:10:22.853 "zcopy": true, 00:10:22.853 "get_zone_info": false, 00:10:22.853 "zone_management": false, 00:10:22.853 "zone_append": false, 00:10:22.853 "compare": false, 00:10:22.853 "compare_and_write": false, 00:10:22.853 "abort": true, 00:10:22.853 "seek_hole": false, 00:10:22.853 "seek_data": false, 00:10:22.853 "copy": true, 00:10:22.853 "nvme_iov_md": false 00:10:22.853 }, 00:10:22.853 "memory_domains": [ 00:10:22.853 { 00:10:22.853 "dma_device_id": "system", 00:10:22.853 "dma_device_type": 1 00:10:22.853 }, 00:10:22.853 { 00:10:22.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.853 "dma_device_type": 2 00:10:22.853 } 00:10:22.853 ], 00:10:22.853 "driver_specific": {} 00:10:22.853 }' 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:22.853 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:10:23.110 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:23.110 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:23.110 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:23.110 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:23.110 23:32:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:23.110 [2024-07-24 23:32:08.096342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:23.368 "name": "Existed_Raid", 00:10:23.368 "uuid": "86507d78-c8b6-48fa-9deb-db31796bf21a", 00:10:23.368 "strip_size_kb": 0, 00:10:23.368 "state": "online", 00:10:23.368 "raid_level": "raid1", 00:10:23.368 "superblock": true, 00:10:23.368 "num_base_bdevs": 2, 00:10:23.368 "num_base_bdevs_discovered": 1, 00:10:23.368 "num_base_bdevs_operational": 1, 00:10:23.368 "base_bdevs_list": [ 00:10:23.368 { 00:10:23.368 "name": null, 00:10:23.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:23.368 "is_configured": false, 00:10:23.368 "data_offset": 2048, 00:10:23.368 "data_size": 63488 00:10:23.368 }, 00:10:23.368 { 00:10:23.368 "name": "BaseBdev2", 00:10:23.368 "uuid": "91f8b951-117e-4ec3-b89e-4bbb1ea04542", 00:10:23.368 "is_configured": true, 00:10:23.368 "data_offset": 2048, 00:10:23.368 "data_size": 63488 00:10:23.368 } 00:10:23.368 ] 00:10:23.368 }' 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:23.368 23:32:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set 
+x 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:23.934 23:32:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:24.192 [2024-07-24 23:32:09.059757] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:24.192 [2024-07-24 23:32:09.059822] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:24.192 [2024-07-24 23:32:09.070014] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:24.192 [2024-07-24 23:32:09.070041] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:24.192 [2024-07-24 23:32:09.070046] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf2c050 name Existed_Raid, state offline 00:10:24.192 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:24.192 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:24.192 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.192 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 257519 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 257519 ']' 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 257519 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 257519 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:24.451 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:24.452 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 257519' 00:10:24.452 killing process with pid 257519 00:10:24.452 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 257519 00:10:24.452 [2024-07-24 23:32:09.301280] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:24.452 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 257519 
00:10:24.452 [2024-07-24 23:32:09.302050] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:24.710 23:32:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:10:24.710 00:10:24.710 real 0m7.927s 00:10:24.710 user 0m14.237s 00:10:24.710 sys 0m1.254s 00:10:24.710 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:24.710 23:32:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:24.710 ************************************ 00:10:24.710 END TEST raid_state_function_test_sb 00:10:24.710 ************************************ 00:10:24.710 23:32:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:10:24.710 23:32:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:10:24.710 23:32:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:24.710 23:32:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:24.710 ************************************ 00:10:24.710 START TEST raid_superblock_test 00:10:24.710 ************************************ 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 
-- # base_bdevs_pt_uuid=() 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=259110 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 259110 /var/tmp/spdk-raid.sock 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 259110 ']' 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:24.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:24.710 23:32:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.710 [2024-07-24 23:32:09.593253] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:10:24.710 [2024-07-24 23:32:09.593291] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid259110 ] 00:10:24.710 [2024-07-24 23:32:09.655707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:24.969 [2024-07-24 23:32:09.735320] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:24.969 [2024-07-24 23:32:09.788413] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:24.969 [2024-07-24 23:32:09.788440] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:25.534 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:25.792 malloc1 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:25.792 [2024-07-24 23:32:10.720156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:25.792 [2024-07-24 23:32:10.720192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.792 [2024-07-24 23:32:10.720204] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1edfdd0 00:10:25.792 [2024-07-24 23:32:10.720210] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.792 [2024-07-24 23:32:10.721279] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.792 [2024-07-24 23:32:10.721300] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:25.792 pt1 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:25.792 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:26.050 malloc2 00:10:26.050 23:32:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:26.050 [2024-07-24 23:32:11.036607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:26.050 [2024-07-24 23:32:11.036637] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:26.050 [2024-07-24 23:32:11.036648] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ee08d0 00:10:26.050 [2024-07-24 23:32:11.036653] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:26.050 [2024-07-24 23:32:11.037637] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:26.050 [2024-07-24 23:32:11.037657] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:26.050 pt2 00:10:26.050 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:26.050 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:26.308 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:26.308 [2024-07-24 23:32:11.205059] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:26.308 [2024-07-24 23:32:11.205921] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:26.308 [2024-07-24 23:32:11.206025] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ee32d0 00:10:26.308 [2024-07-24 23:32:11.206032] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:26.308 [2024-07-24 23:32:11.206155] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ee2730 00:10:26.309 [2024-07-24 23:32:11.206256] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ee32d0 00:10:26.309 [2024-07-24 23:32:11.206261] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ee32d0 00:10:26.309 [2024-07-24 23:32:11.206325] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:26.309 23:32:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:26.309 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:26.566 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:26.566 "name": "raid_bdev1", 00:10:26.566 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:26.566 "strip_size_kb": 0, 00:10:26.566 "state": "online", 00:10:26.566 "raid_level": "raid1", 00:10:26.566 "superblock": true, 00:10:26.566 "num_base_bdevs": 2, 00:10:26.566 "num_base_bdevs_discovered": 2, 00:10:26.566 "num_base_bdevs_operational": 2, 00:10:26.566 "base_bdevs_list": [ 00:10:26.566 { 00:10:26.566 "name": "pt1", 00:10:26.566 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:26.566 "is_configured": true, 00:10:26.566 "data_offset": 2048, 00:10:26.566 "data_size": 63488 00:10:26.566 }, 00:10:26.566 { 00:10:26.566 "name": "pt2", 00:10:26.566 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:26.566 "is_configured": true, 00:10:26.566 "data_offset": 2048, 00:10:26.566 "data_size": 63488 00:10:26.566 } 00:10:26.566 ] 00:10:26.566 }' 00:10:26.566 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:26.566 23:32:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:27.131 23:32:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:27.131 [2024-07-24 23:32:12.031317] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:27.131 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:27.131 "name": "raid_bdev1", 00:10:27.131 "aliases": [ 00:10:27.131 "b8d6eed5-d164-4dd8-a46b-7444d54889dd" 00:10:27.131 ], 00:10:27.131 "product_name": "Raid Volume", 00:10:27.131 "block_size": 512, 00:10:27.131 "num_blocks": 63488, 00:10:27.131 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:27.131 "assigned_rate_limits": { 00:10:27.131 "rw_ios_per_sec": 0, 00:10:27.131 "rw_mbytes_per_sec": 0, 00:10:27.131 "r_mbytes_per_sec": 0, 00:10:27.131 "w_mbytes_per_sec": 0 00:10:27.131 }, 00:10:27.131 "claimed": false, 00:10:27.131 "zoned": false, 00:10:27.131 "supported_io_types": { 00:10:27.131 "read": true, 00:10:27.131 "write": true, 00:10:27.131 "unmap": false, 00:10:27.131 "flush": false, 00:10:27.131 "reset": true, 00:10:27.131 "nvme_admin": false, 00:10:27.131 "nvme_io": false, 00:10:27.131 "nvme_io_md": false, 00:10:27.131 "write_zeroes": true, 00:10:27.131 "zcopy": false, 00:10:27.131 "get_zone_info": false, 00:10:27.131 "zone_management": false, 00:10:27.131 "zone_append": false, 00:10:27.131 "compare": false, 00:10:27.131 "compare_and_write": false, 00:10:27.131 "abort": false, 00:10:27.131 "seek_hole": false, 00:10:27.131 "seek_data": false, 00:10:27.131 "copy": false, 00:10:27.131 "nvme_iov_md": false 00:10:27.131 }, 00:10:27.131 "memory_domains": 
[ 00:10:27.131 { 00:10:27.131 "dma_device_id": "system", 00:10:27.131 "dma_device_type": 1 00:10:27.131 }, 00:10:27.131 { 00:10:27.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.131 "dma_device_type": 2 00:10:27.131 }, 00:10:27.131 { 00:10:27.131 "dma_device_id": "system", 00:10:27.131 "dma_device_type": 1 00:10:27.131 }, 00:10:27.131 { 00:10:27.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.131 "dma_device_type": 2 00:10:27.131 } 00:10:27.131 ], 00:10:27.131 "driver_specific": { 00:10:27.131 "raid": { 00:10:27.131 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:27.131 "strip_size_kb": 0, 00:10:27.131 "state": "online", 00:10:27.131 "raid_level": "raid1", 00:10:27.131 "superblock": true, 00:10:27.131 "num_base_bdevs": 2, 00:10:27.131 "num_base_bdevs_discovered": 2, 00:10:27.131 "num_base_bdevs_operational": 2, 00:10:27.131 "base_bdevs_list": [ 00:10:27.131 { 00:10:27.131 "name": "pt1", 00:10:27.131 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:27.131 "is_configured": true, 00:10:27.131 "data_offset": 2048, 00:10:27.131 "data_size": 63488 00:10:27.131 }, 00:10:27.131 { 00:10:27.131 "name": "pt2", 00:10:27.131 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:27.131 "is_configured": true, 00:10:27.131 "data_offset": 2048, 00:10:27.131 "data_size": 63488 00:10:27.131 } 00:10:27.131 ] 00:10:27.131 } 00:10:27.131 } 00:10:27.132 }' 00:10:27.132 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:27.132 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:27.132 pt2' 00:10:27.132 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:27.132 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:27.132 
23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:27.389 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:27.389 "name": "pt1", 00:10:27.389 "aliases": [ 00:10:27.389 "00000000-0000-0000-0000-000000000001" 00:10:27.389 ], 00:10:27.389 "product_name": "passthru", 00:10:27.389 "block_size": 512, 00:10:27.389 "num_blocks": 65536, 00:10:27.389 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:27.389 "assigned_rate_limits": { 00:10:27.389 "rw_ios_per_sec": 0, 00:10:27.389 "rw_mbytes_per_sec": 0, 00:10:27.389 "r_mbytes_per_sec": 0, 00:10:27.389 "w_mbytes_per_sec": 0 00:10:27.389 }, 00:10:27.389 "claimed": true, 00:10:27.389 "claim_type": "exclusive_write", 00:10:27.389 "zoned": false, 00:10:27.389 "supported_io_types": { 00:10:27.389 "read": true, 00:10:27.389 "write": true, 00:10:27.389 "unmap": true, 00:10:27.389 "flush": true, 00:10:27.389 "reset": true, 00:10:27.389 "nvme_admin": false, 00:10:27.389 "nvme_io": false, 00:10:27.389 "nvme_io_md": false, 00:10:27.389 "write_zeroes": true, 00:10:27.389 "zcopy": true, 00:10:27.389 "get_zone_info": false, 00:10:27.389 "zone_management": false, 00:10:27.389 "zone_append": false, 00:10:27.389 "compare": false, 00:10:27.389 "compare_and_write": false, 00:10:27.389 "abort": true, 00:10:27.389 "seek_hole": false, 00:10:27.389 "seek_data": false, 00:10:27.389 "copy": true, 00:10:27.389 "nvme_iov_md": false 00:10:27.389 }, 00:10:27.389 "memory_domains": [ 00:10:27.390 { 00:10:27.390 "dma_device_id": "system", 00:10:27.390 "dma_device_type": 1 00:10:27.390 }, 00:10:27.390 { 00:10:27.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.390 "dma_device_type": 2 00:10:27.390 } 00:10:27.390 ], 00:10:27.390 "driver_specific": { 00:10:27.390 "passthru": { 00:10:27.390 "name": "pt1", 00:10:27.390 "base_bdev_name": "malloc1" 00:10:27.390 } 00:10:27.390 } 00:10:27.390 }' 00:10:27.390 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:10:27.390 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.390 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:27.390 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.390 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.647 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:27.647 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.647 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.647 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:27.647 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.648 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.648 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:27.648 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:27.648 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:27.648 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:27.905 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:27.905 "name": "pt2", 00:10:27.905 "aliases": [ 00:10:27.905 "00000000-0000-0000-0000-000000000002" 00:10:27.905 ], 00:10:27.905 "product_name": "passthru", 00:10:27.905 "block_size": 512, 00:10:27.905 "num_blocks": 65536, 00:10:27.905 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:27.905 "assigned_rate_limits": { 00:10:27.905 "rw_ios_per_sec": 0, 00:10:27.905 "rw_mbytes_per_sec": 0, 
00:10:27.905 "r_mbytes_per_sec": 0, 00:10:27.905 "w_mbytes_per_sec": 0 00:10:27.905 }, 00:10:27.905 "claimed": true, 00:10:27.905 "claim_type": "exclusive_write", 00:10:27.905 "zoned": false, 00:10:27.905 "supported_io_types": { 00:10:27.905 "read": true, 00:10:27.905 "write": true, 00:10:27.905 "unmap": true, 00:10:27.905 "flush": true, 00:10:27.905 "reset": true, 00:10:27.905 "nvme_admin": false, 00:10:27.905 "nvme_io": false, 00:10:27.905 "nvme_io_md": false, 00:10:27.906 "write_zeroes": true, 00:10:27.906 "zcopy": true, 00:10:27.906 "get_zone_info": false, 00:10:27.906 "zone_management": false, 00:10:27.906 "zone_append": false, 00:10:27.906 "compare": false, 00:10:27.906 "compare_and_write": false, 00:10:27.906 "abort": true, 00:10:27.906 "seek_hole": false, 00:10:27.906 "seek_data": false, 00:10:27.906 "copy": true, 00:10:27.906 "nvme_iov_md": false 00:10:27.906 }, 00:10:27.906 "memory_domains": [ 00:10:27.906 { 00:10:27.906 "dma_device_id": "system", 00:10:27.906 "dma_device_type": 1 00:10:27.906 }, 00:10:27.906 { 00:10:27.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.906 "dma_device_type": 2 00:10:27.906 } 00:10:27.906 ], 00:10:27.906 "driver_specific": { 00:10:27.906 "passthru": { 00:10:27.906 "name": "pt2", 00:10:27.906 "base_bdev_name": "malloc2" 00:10:27.906 } 00:10:27.906 } 00:10:27.906 }' 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:27.906 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:10:28.163 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:28.163 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:28.163 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:28.163 23:32:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:28.163 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:28.163 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:28.163 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:28.421 [2024-07-24 23:32:13.170264] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:28.421 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b8d6eed5-d164-4dd8-a46b-7444d54889dd 00:10:28.421 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b8d6eed5-d164-4dd8-a46b-7444d54889dd ']' 00:10:28.421 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:28.421 [2024-07-24 23:32:13.342546] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:28.421 [2024-07-24 23:32:13.342557] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:28.421 [2024-07-24 23:32:13.342594] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:28.421 [2024-07-24 23:32:13.342629] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:28.421 [2024-07-24 23:32:13.342635] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ee32d0 name raid_bdev1, state offline 00:10:28.421 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.421 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:28.679 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:28.679 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:28.679 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:28.679 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:28.937 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:28.937 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:28.937 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:28.937 23:32:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:10:29.195 23:32:14 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:29.195 [2024-07-24 23:32:14.168683] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:29.195 [2024-07-24 23:32:14.169653] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:29.195 [2024-07-24 23:32:14.169695] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: 
Superblock of a different raid bdev found on bdev malloc1 00:10:29.195 [2024-07-24 23:32:14.169723] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:29.195 [2024-07-24 23:32:14.169733] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:29.195 [2024-07-24 23:32:14.169738] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ee3010 name raid_bdev1, state configuring 00:10:29.195 request: 00:10:29.195 { 00:10:29.195 "name": "raid_bdev1", 00:10:29.195 "raid_level": "raid1", 00:10:29.195 "base_bdevs": [ 00:10:29.195 "malloc1", 00:10:29.195 "malloc2" 00:10:29.195 ], 00:10:29.195 "superblock": false, 00:10:29.195 "method": "bdev_raid_create", 00:10:29.195 "req_id": 1 00:10:29.195 } 00:10:29.195 Got JSON-RPC error response 00:10:29.195 response: 00:10:29.195 { 00:10:29.195 "code": -17, 00:10:29.195 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:29.195 } 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.195 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:29.453 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:29.453 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:29.453 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:29.710 [2024-07-24 23:32:14.505520] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:29.710 [2024-07-24 23:32:14.505559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:29.710 [2024-07-24 23:32:14.505569] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ee28d0 00:10:29.710 [2024-07-24 23:32:14.505575] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:29.710 [2024-07-24 23:32:14.506764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:29.710 [2024-07-24 23:32:14.506785] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:29.710 [2024-07-24 23:32:14.506833] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:29.710 [2024-07-24 23:32:14.506851] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:29.710 pt1 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:29.710 23:32:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:29.710 "name": "raid_bdev1", 00:10:29.710 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:29.710 "strip_size_kb": 0, 00:10:29.710 "state": "configuring", 00:10:29.710 "raid_level": "raid1", 00:10:29.710 "superblock": true, 00:10:29.710 "num_base_bdevs": 2, 00:10:29.710 "num_base_bdevs_discovered": 1, 00:10:29.710 "num_base_bdevs_operational": 2, 00:10:29.710 "base_bdevs_list": [ 00:10:29.710 { 00:10:29.710 "name": "pt1", 00:10:29.710 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:29.710 "is_configured": true, 00:10:29.710 "data_offset": 2048, 00:10:29.710 "data_size": 63488 00:10:29.710 }, 00:10:29.710 { 00:10:29.710 "name": null, 00:10:29.710 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:29.710 "is_configured": false, 00:10:29.710 "data_offset": 2048, 00:10:29.710 "data_size": 63488 00:10:29.710 } 00:10:29.710 ] 00:10:29.710 }' 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:29.710 23:32:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:30.276 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:30.276 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:30.276 23:32:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:30.276 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:30.534 [2024-07-24 23:32:15.319634] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:30.534 [2024-07-24 23:32:15.319671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:30.534 [2024-07-24 23:32:15.319682] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa3510 00:10:30.534 [2024-07-24 23:32:15.319687] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:30.534 [2024-07-24 23:32:15.319926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:30.534 [2024-07-24 23:32:15.319938] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:30.534 [2024-07-24 23:32:15.319983] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:30.534 [2024-07-24 23:32:15.319997] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:30.534 [2024-07-24 23:32:15.320064] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ee2400 00:10:30.534 [2024-07-24 23:32:15.320070] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:30.534 [2024-07-24 23:32:15.320176] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ed9f50 00:10:30.534 [2024-07-24 23:32:15.320257] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ee2400 00:10:30.534 [2024-07-24 23:32:15.320262] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ee2400 00:10:30.534 [2024-07-24 23:32:15.320321] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:30.534 pt2 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.534 "name": "raid_bdev1", 00:10:30.534 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:30.534 "strip_size_kb": 0, 00:10:30.534 "state": "online", 00:10:30.534 
"raid_level": "raid1", 00:10:30.534 "superblock": true, 00:10:30.534 "num_base_bdevs": 2, 00:10:30.534 "num_base_bdevs_discovered": 2, 00:10:30.534 "num_base_bdevs_operational": 2, 00:10:30.534 "base_bdevs_list": [ 00:10:30.534 { 00:10:30.534 "name": "pt1", 00:10:30.534 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:30.534 "is_configured": true, 00:10:30.534 "data_offset": 2048, 00:10:30.534 "data_size": 63488 00:10:30.534 }, 00:10:30.534 { 00:10:30.534 "name": "pt2", 00:10:30.534 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:30.534 "is_configured": true, 00:10:30.534 "data_offset": 2048, 00:10:30.534 "data_size": 63488 00:10:30.534 } 00:10:30.534 ] 00:10:30.534 }' 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.534 23:32:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:31.099 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:31.356 [2024-07-24 23:32:16.166000] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:31.356 "name": "raid_bdev1", 00:10:31.356 "aliases": [ 00:10:31.356 "b8d6eed5-d164-4dd8-a46b-7444d54889dd" 00:10:31.356 ], 00:10:31.356 "product_name": "Raid Volume", 00:10:31.356 "block_size": 512, 00:10:31.356 "num_blocks": 63488, 00:10:31.356 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:31.356 "assigned_rate_limits": { 00:10:31.356 "rw_ios_per_sec": 0, 00:10:31.356 "rw_mbytes_per_sec": 0, 00:10:31.356 "r_mbytes_per_sec": 0, 00:10:31.356 "w_mbytes_per_sec": 0 00:10:31.356 }, 00:10:31.356 "claimed": false, 00:10:31.356 "zoned": false, 00:10:31.356 "supported_io_types": { 00:10:31.356 "read": true, 00:10:31.356 "write": true, 00:10:31.356 "unmap": false, 00:10:31.356 "flush": false, 00:10:31.356 "reset": true, 00:10:31.356 "nvme_admin": false, 00:10:31.356 "nvme_io": false, 00:10:31.356 "nvme_io_md": false, 00:10:31.356 "write_zeroes": true, 00:10:31.356 "zcopy": false, 00:10:31.356 "get_zone_info": false, 00:10:31.356 "zone_management": false, 00:10:31.356 "zone_append": false, 00:10:31.356 "compare": false, 00:10:31.356 "compare_and_write": false, 00:10:31.356 "abort": false, 00:10:31.356 "seek_hole": false, 00:10:31.356 "seek_data": false, 00:10:31.356 "copy": false, 00:10:31.356 "nvme_iov_md": false 00:10:31.356 }, 00:10:31.356 "memory_domains": [ 00:10:31.356 { 00:10:31.356 "dma_device_id": "system", 00:10:31.356 "dma_device_type": 1 00:10:31.356 }, 00:10:31.356 { 00:10:31.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.356 "dma_device_type": 2 00:10:31.356 }, 00:10:31.356 { 00:10:31.356 "dma_device_id": "system", 00:10:31.356 "dma_device_type": 1 00:10:31.356 }, 00:10:31.356 { 00:10:31.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.356 "dma_device_type": 2 00:10:31.356 } 00:10:31.356 ], 00:10:31.356 "driver_specific": { 00:10:31.356 "raid": { 00:10:31.356 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:31.356 "strip_size_kb": 0, 00:10:31.356 "state": "online", 00:10:31.356 
"raid_level": "raid1", 00:10:31.356 "superblock": true, 00:10:31.356 "num_base_bdevs": 2, 00:10:31.356 "num_base_bdevs_discovered": 2, 00:10:31.356 "num_base_bdevs_operational": 2, 00:10:31.356 "base_bdevs_list": [ 00:10:31.356 { 00:10:31.356 "name": "pt1", 00:10:31.356 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:31.356 "is_configured": true, 00:10:31.356 "data_offset": 2048, 00:10:31.356 "data_size": 63488 00:10:31.356 }, 00:10:31.356 { 00:10:31.356 "name": "pt2", 00:10:31.356 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:31.356 "is_configured": true, 00:10:31.356 "data_offset": 2048, 00:10:31.356 "data_size": 63488 00:10:31.356 } 00:10:31.356 ] 00:10:31.356 } 00:10:31.356 } 00:10:31.356 }' 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:31.356 pt2' 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:31.356 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:31.613 "name": "pt1", 00:10:31.613 "aliases": [ 00:10:31.613 "00000000-0000-0000-0000-000000000001" 00:10:31.613 ], 00:10:31.613 "product_name": "passthru", 00:10:31.613 "block_size": 512, 00:10:31.613 "num_blocks": 65536, 00:10:31.613 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:31.613 "assigned_rate_limits": { 00:10:31.613 "rw_ios_per_sec": 0, 00:10:31.613 "rw_mbytes_per_sec": 0, 00:10:31.613 "r_mbytes_per_sec": 0, 00:10:31.613 "w_mbytes_per_sec": 0 00:10:31.613 }, 
00:10:31.613 "claimed": true, 00:10:31.613 "claim_type": "exclusive_write", 00:10:31.613 "zoned": false, 00:10:31.613 "supported_io_types": { 00:10:31.613 "read": true, 00:10:31.613 "write": true, 00:10:31.613 "unmap": true, 00:10:31.613 "flush": true, 00:10:31.613 "reset": true, 00:10:31.613 "nvme_admin": false, 00:10:31.613 "nvme_io": false, 00:10:31.613 "nvme_io_md": false, 00:10:31.613 "write_zeroes": true, 00:10:31.613 "zcopy": true, 00:10:31.613 "get_zone_info": false, 00:10:31.613 "zone_management": false, 00:10:31.613 "zone_append": false, 00:10:31.613 "compare": false, 00:10:31.613 "compare_and_write": false, 00:10:31.613 "abort": true, 00:10:31.613 "seek_hole": false, 00:10:31.613 "seek_data": false, 00:10:31.613 "copy": true, 00:10:31.613 "nvme_iov_md": false 00:10:31.613 }, 00:10:31.613 "memory_domains": [ 00:10:31.613 { 00:10:31.613 "dma_device_id": "system", 00:10:31.613 "dma_device_type": 1 00:10:31.613 }, 00:10:31.613 { 00:10:31.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.613 "dma_device_type": 2 00:10:31.613 } 00:10:31.613 ], 00:10:31.613 "driver_specific": { 00:10:31.613 "passthru": { 00:10:31.613 "name": "pt1", 00:10:31.613 "base_bdev_name": "malloc1" 00:10:31.613 } 00:10:31.613 } 00:10:31.613 }' 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.613 23:32:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:31.871 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:31.871 "name": "pt2", 00:10:31.871 "aliases": [ 00:10:31.871 "00000000-0000-0000-0000-000000000002" 00:10:31.871 ], 00:10:31.871 "product_name": "passthru", 00:10:31.871 "block_size": 512, 00:10:31.871 "num_blocks": 65536, 00:10:31.871 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:31.871 "assigned_rate_limits": { 00:10:31.871 "rw_ios_per_sec": 0, 00:10:31.871 "rw_mbytes_per_sec": 0, 00:10:31.871 "r_mbytes_per_sec": 0, 00:10:31.871 "w_mbytes_per_sec": 0 00:10:31.871 }, 00:10:31.871 "claimed": true, 00:10:31.871 "claim_type": "exclusive_write", 00:10:31.871 "zoned": false, 00:10:31.871 "supported_io_types": { 00:10:31.871 "read": true, 00:10:31.871 "write": true, 00:10:31.871 "unmap": true, 00:10:31.871 "flush": true, 00:10:31.871 "reset": true, 00:10:31.871 "nvme_admin": false, 00:10:31.871 "nvme_io": false, 00:10:31.871 "nvme_io_md": false, 00:10:31.871 "write_zeroes": true, 00:10:31.871 "zcopy": true, 00:10:31.871 "get_zone_info": false, 00:10:31.871 "zone_management": false, 00:10:31.871 "zone_append": false, 00:10:31.871 
"compare": false, 00:10:31.871 "compare_and_write": false, 00:10:31.871 "abort": true, 00:10:31.871 "seek_hole": false, 00:10:31.871 "seek_data": false, 00:10:31.871 "copy": true, 00:10:31.871 "nvme_iov_md": false 00:10:31.871 }, 00:10:31.871 "memory_domains": [ 00:10:31.871 { 00:10:31.871 "dma_device_id": "system", 00:10:31.871 "dma_device_type": 1 00:10:31.871 }, 00:10:31.871 { 00:10:31.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.871 "dma_device_type": 2 00:10:31.871 } 00:10:31.871 ], 00:10:31.871 "driver_specific": { 00:10:31.871 "passthru": { 00:10:31.871 "name": "pt2", 00:10:31.871 "base_bdev_name": "malloc2" 00:10:31.871 } 00:10:31.871 } 00:10:31.871 }' 00:10:32.129 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:32.129 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:32.129 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:32.129 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:32.129 23:32:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:32.129 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:32.129 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:32.129 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:32.129 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:32.129 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:32.387 23:32:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:32.387 [2024-07-24 23:32:17.345044] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b8d6eed5-d164-4dd8-a46b-7444d54889dd '!=' b8d6eed5-d164-4dd8-a46b-7444d54889dd ']' 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:32.387 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:32.645 [2024-07-24 23:32:17.521366] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:10:32.645 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:32.645 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:32.645 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.646 23:32:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.646 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:32.904 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.904 "name": "raid_bdev1", 00:10:32.904 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:32.904 "strip_size_kb": 0, 00:10:32.904 "state": "online", 00:10:32.904 "raid_level": "raid1", 00:10:32.904 "superblock": true, 00:10:32.904 "num_base_bdevs": 2, 00:10:32.904 "num_base_bdevs_discovered": 1, 00:10:32.904 "num_base_bdevs_operational": 1, 00:10:32.904 "base_bdevs_list": [ 00:10:32.904 { 00:10:32.904 "name": null, 00:10:32.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.904 "is_configured": false, 00:10:32.904 "data_offset": 2048, 00:10:32.904 "data_size": 63488 00:10:32.904 }, 00:10:32.904 { 00:10:32.904 "name": "pt2", 00:10:32.904 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:32.904 "is_configured": true, 00:10:32.904 "data_offset": 2048, 00:10:32.904 "data_size": 63488 00:10:32.904 } 00:10:32.904 ] 00:10:32.904 }' 00:10:32.904 23:32:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:32.904 23:32:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.470 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:33.470 [2024-07-24 23:32:18.319414] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:33.470 
[2024-07-24 23:32:18.319432] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:33.470 [2024-07-24 23:32:18.319474] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:33.470 [2024-07-24 23:32:18.319502] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:33.470 [2024-07-24 23:32:18.319508] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ee2400 name raid_bdev1, state offline 00:10:33.470 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.470 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:10:33.728 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:33.986 [2024-07-24 23:32:18.840825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:33.986 [2024-07-24 23:32:18.840859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:33.986 [2024-07-24 23:32:18.840868] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa3890 00:10:33.986 [2024-07-24 23:32:18.840874] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:33.986 [2024-07-24 23:32:18.841994] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:33.986 [2024-07-24 23:32:18.842015] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:33.986 [2024-07-24 23:32:18.842058] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:33.986 [2024-07-24 23:32:18.842076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:33.986 [2024-07-24 23:32:18.842131] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ed9b50 00:10:33.986 [2024-07-24 23:32:18.842137] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:33.986 [2024-07-24 23:32:18.842242] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ed9f30 00:10:33.986 [2024-07-24 23:32:18.842320] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ed9b50 00:10:33.986 [2024-07-24 23:32:18.842325] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ed9b50 00:10:33.986 [2024-07-24 23:32:18.842387] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:33.986 pt2 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:33.986 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.987 23:32:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:34.245 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.245 "name": "raid_bdev1", 00:10:34.245 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:34.245 "strip_size_kb": 0, 00:10:34.245 "state": "online", 00:10:34.245 "raid_level": "raid1", 00:10:34.245 "superblock": true, 00:10:34.245 "num_base_bdevs": 2, 00:10:34.245 "num_base_bdevs_discovered": 1, 00:10:34.245 "num_base_bdevs_operational": 1, 00:10:34.245 "base_bdevs_list": [ 00:10:34.245 { 00:10:34.245 "name": null, 00:10:34.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:34.245 "is_configured": false, 00:10:34.245 
"data_offset": 2048, 00:10:34.245 "data_size": 63488 00:10:34.245 }, 00:10:34.245 { 00:10:34.245 "name": "pt2", 00:10:34.245 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:34.245 "is_configured": true, 00:10:34.245 "data_offset": 2048, 00:10:34.245 "data_size": 63488 00:10:34.245 } 00:10:34.245 ] 00:10:34.245 }' 00:10:34.245 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.245 23:32:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:34.809 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:34.809 [2024-07-24 23:32:19.674980] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:34.809 [2024-07-24 23:32:19.675000] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:34.809 [2024-07-24 23:32:19.675038] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:34.809 [2024-07-24 23:32:19.675068] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:34.809 [2024-07-24 23:32:19.675075] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ed9b50 name raid_bdev1, state offline 00:10:34.809 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:10:34.809 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.066 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:10:35.066 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:10:35.066 23:32:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:10:35.066 23:32:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:35.066 [2024-07-24 23:32:20.031905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:35.066 [2024-07-24 23:32:20.031946] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:35.066 [2024-07-24 23:32:20.031957] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed6720 00:10:35.066 [2024-07-24 23:32:20.031963] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:35.066 [2024-07-24 23:32:20.033127] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:35.066 [2024-07-24 23:32:20.033149] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:35.066 [2024-07-24 23:32:20.033194] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:35.066 [2024-07-24 23:32:20.033213] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:35.066 [2024-07-24 23:32:20.033284] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:10:35.066 [2024-07-24 23:32:20.033291] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:35.066 [2024-07-24 23:32:20.033299] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ed7d10 name raid_bdev1, state configuring 00:10:35.066 [2024-07-24 23:32:20.033312] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:35.066 [2024-07-24 23:32:20.033351] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ed7f90 00:10:35.066 [2024-07-24 23:32:20.033356] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, 
blocklen 512 00:10:35.066 [2024-07-24 23:32:20.033476] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ed7ad0 00:10:35.066 [2024-07-24 23:32:20.033560] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ed7f90 00:10:35.066 [2024-07-24 23:32:20.033565] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ed7f90 00:10:35.066 [2024-07-24 23:32:20.033636] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:35.066 pt1 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.067 23:32:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:35.323 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:35.323 "name": "raid_bdev1", 00:10:35.323 "uuid": "b8d6eed5-d164-4dd8-a46b-7444d54889dd", 00:10:35.323 "strip_size_kb": 0, 00:10:35.323 "state": "online", 00:10:35.323 "raid_level": "raid1", 00:10:35.323 "superblock": true, 00:10:35.323 "num_base_bdevs": 2, 00:10:35.323 "num_base_bdevs_discovered": 1, 00:10:35.323 "num_base_bdevs_operational": 1, 00:10:35.323 "base_bdevs_list": [ 00:10:35.323 { 00:10:35.323 "name": null, 00:10:35.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:35.323 "is_configured": false, 00:10:35.323 "data_offset": 2048, 00:10:35.323 "data_size": 63488 00:10:35.323 }, 00:10:35.323 { 00:10:35.323 "name": "pt2", 00:10:35.323 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:35.323 "is_configured": true, 00:10:35.323 "data_offset": 2048, 00:10:35.323 "data_size": 63488 00:10:35.323 } 00:10:35.323 ] 00:10:35.323 }' 00:10:35.323 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.323 23:32:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:35.887 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:35.887 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:10:35.887 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:10:35.887 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:35.887 23:32:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:10:36.145 [2024-07-24 
23:32:21.022591] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' b8d6eed5-d164-4dd8-a46b-7444d54889dd '!=' b8d6eed5-d164-4dd8-a46b-7444d54889dd ']' 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 259110 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 259110 ']' 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 259110 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 259110 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 259110' 00:10:36.145 killing process with pid 259110 00:10:36.145 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 259110 00:10:36.145 [2024-07-24 23:32:21.077048] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:36.145 [2024-07-24 23:32:21.077085] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:36.145 [2024-07-24 23:32:21.077114] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:36.145 [2024-07-24 23:32:21.077120] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ed7f90 name raid_bdev1, state offline 00:10:36.145 23:32:21 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 259110 00:10:36.145 [2024-07-24 23:32:21.092442] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:36.404 23:32:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:36.404 00:10:36.404 real 0m11.721s 00:10:36.404 user 0m21.486s 00:10:36.404 sys 0m1.877s 00:10:36.404 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:36.404 23:32:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.404 ************************************ 00:10:36.404 END TEST raid_superblock_test 00:10:36.404 ************************************ 00:10:36.404 23:32:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:10:36.404 23:32:21 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:36.404 23:32:21 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:36.404 23:32:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:36.404 ************************************ 00:10:36.404 START TEST raid_read_error_test 00:10:36.404 ************************************ 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 read 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:36.404 23:32:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cxo54q61hc 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=261482 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 261482 /var/tmp/spdk-raid.sock 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 261482 ']' 00:10:36.404 
23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:36.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:36.404 23:32:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.404 [2024-07-24 23:32:21.384220] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:10:36.404 [2024-07-24 23:32:21.384257] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid261482 ] 00:10:36.662 [2024-07-24 23:32:21.446847] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:36.662 [2024-07-24 23:32:21.524835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:36.662 [2024-07-24 23:32:21.575083] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:36.662 [2024-07-24 23:32:21.575109] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:37.228 23:32:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:37.228 23:32:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:10:37.228 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:37.228 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:37.486 BaseBdev1_malloc 00:10:37.486 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:37.744 true 00:10:37.744 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:37.744 [2024-07-24 23:32:22.667054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:37.744 [2024-07-24 23:32:22.667086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:10:37.744 [2024-07-24 23:32:22.667097] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab0550 00:10:37.745 [2024-07-24 23:32:22.667103] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:37.745 [2024-07-24 23:32:22.668332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:37.745 [2024-07-24 23:32:22.668353] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:37.745 BaseBdev1 00:10:37.745 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:37.745 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:38.003 BaseBdev2_malloc 00:10:38.003 23:32:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:38.263 true 00:10:38.263 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:38.263 [2024-07-24 23:32:23.179742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:38.263 [2024-07-24 23:32:23.179771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:38.263 [2024-07-24 23:32:23.179783] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xab4d90 00:10:38.263 [2024-07-24 23:32:23.179789] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:38.263 [2024-07-24 23:32:23.180812] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:38.263 [2024-07-24 23:32:23.180832] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:38.263 BaseBdev2 00:10:38.264 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:38.543 [2024-07-24 23:32:23.344189] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:38.543 [2024-07-24 23:32:23.345113] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:38.543 [2024-07-24 23:32:23.345254] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xab67a0 00:10:38.543 [2024-07-24 23:32:23.345264] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:38.543 [2024-07-24 23:32:23.345398] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x91e180 00:10:38.543 [2024-07-24 23:32:23.345520] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xab67a0 00:10:38.543 [2024-07-24 23:32:23.345525] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xab67a0 00:10:38.543 [2024-07-24 23:32:23.345596] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:38.543 "name": "raid_bdev1", 00:10:38.543 "uuid": "6507cd42-7cb6-4d2a-916a-9d9041344fb2", 00:10:38.543 "strip_size_kb": 0, 00:10:38.543 "state": "online", 00:10:38.543 "raid_level": "raid1", 00:10:38.543 "superblock": true, 00:10:38.543 "num_base_bdevs": 2, 00:10:38.543 "num_base_bdevs_discovered": 2, 00:10:38.543 "num_base_bdevs_operational": 2, 00:10:38.543 "base_bdevs_list": [ 00:10:38.543 { 00:10:38.543 "name": "BaseBdev1", 00:10:38.543 "uuid": "89ea9f1a-0093-5019-80bd-c33d30f89cbf", 00:10:38.543 "is_configured": true, 00:10:38.543 "data_offset": 2048, 00:10:38.543 "data_size": 63488 00:10:38.543 }, 00:10:38.543 { 00:10:38.543 "name": "BaseBdev2", 00:10:38.543 "uuid": "568cfd44-1db8-5be0-9c6f-a4c2e5cd2699", 00:10:38.543 "is_configured": true, 00:10:38.543 "data_offset": 2048, 00:10:38.543 "data_size": 63488 00:10:38.543 } 00:10:38.543 ] 00:10:38.543 }' 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:38.543 23:32:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:39.121 23:32:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:39.122 23:32:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:39.122 [2024-07-24 23:32:24.094482] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xab20f0 00:10:40.056 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:40.315 23:32:25 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:40.315 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:40.574 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:40.574 "name": "raid_bdev1", 00:10:40.574 "uuid": "6507cd42-7cb6-4d2a-916a-9d9041344fb2", 00:10:40.574 "strip_size_kb": 0, 00:10:40.574 "state": "online", 00:10:40.574 "raid_level": "raid1", 00:10:40.574 "superblock": true, 00:10:40.574 "num_base_bdevs": 2, 00:10:40.574 "num_base_bdevs_discovered": 2, 00:10:40.574 "num_base_bdevs_operational": 2, 00:10:40.574 "base_bdevs_list": [ 00:10:40.574 { 00:10:40.574 "name": "BaseBdev1", 00:10:40.574 "uuid": "89ea9f1a-0093-5019-80bd-c33d30f89cbf", 00:10:40.574 "is_configured": true, 00:10:40.574 "data_offset": 2048, 00:10:40.574 "data_size": 63488 00:10:40.574 }, 00:10:40.574 { 00:10:40.574 "name": "BaseBdev2", 00:10:40.574 "uuid": "568cfd44-1db8-5be0-9c6f-a4c2e5cd2699", 00:10:40.574 "is_configured": true, 00:10:40.574 "data_offset": 2048, 00:10:40.574 "data_size": 63488 00:10:40.574 } 00:10:40.574 ] 00:10:40.574 }' 00:10:40.574 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:40.574 23:32:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.141 23:32:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:41.141 [2024-07-24 23:32:26.038351] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:10:41.141 [2024-07-24 23:32:26.038381] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:41.141 [2024-07-24 23:32:26.040452] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:41.141 [2024-07-24 23:32:26.040475] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:41.141 [2024-07-24 23:32:26.040546] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:41.141 [2024-07-24 23:32:26.040552] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xab67a0 name raid_bdev1, state offline 00:10:41.141 0 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 261482 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 261482 ']' 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 261482 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 261482 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 261482' 00:10:41.141 killing process with pid 261482 00:10:41.141 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 261482 00:10:41.141 [2024-07-24 23:32:26.098094] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:41.141 23:32:26 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 261482 00:10:41.141 [2024-07-24 23:32:26.107875] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cxo54q61hc 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:10:41.400 00:10:41.400 real 0m4.964s 00:10:41.400 user 0m7.626s 00:10:41.400 sys 0m0.712s 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:41.400 23:32:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.400 ************************************ 00:10:41.400 END TEST raid_read_error_test 00:10:41.400 ************************************ 00:10:41.400 23:32:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:10:41.400 23:32:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:41.400 23:32:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:41.400 23:32:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:41.400 ************************************ 00:10:41.400 START TEST raid_write_error_test 00:10:41.400 ************************************ 00:10:41.400 23:32:26 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 write 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:41.400 
23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.rZTNl0xQUF 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=262312 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 262312 /var/tmp/spdk-raid.sock 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 262312 ']' 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:41.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:41.400 23:32:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.658 [2024-07-24 23:32:26.430083] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:10:41.658 [2024-07-24 23:32:26.430124] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid262312 ] 00:10:41.658 [2024-07-24 23:32:26.492947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.658 [2024-07-24 23:32:26.570717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.658 [2024-07-24 23:32:26.629429] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.658 [2024-07-24 23:32:26.629458] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:42.224 23:32:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:42.224 23:32:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:10:42.224 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:42.224 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:42.482 BaseBdev1_malloc 00:10:42.482 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:42.740 true 00:10:42.740 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:42.740 [2024-07-24 23:32:27.713805] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:42.740 [2024-07-24 23:32:27.713836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:10:42.740 [2024-07-24 23:32:27.713849] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a6550 00:10:42.740 [2024-07-24 23:32:27.713855] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:42.740 [2024-07-24 23:32:27.715082] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:42.740 [2024-07-24 23:32:27.715104] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:42.741 BaseBdev1 00:10:42.741 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:42.741 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:42.999 BaseBdev2_malloc 00:10:42.999 23:32:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:43.257 true 00:10:43.257 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:43.257 [2024-07-24 23:32:28.214681] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:43.257 [2024-07-24 23:32:28.214712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:43.257 [2024-07-24 23:32:28.214726] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11aad90 00:10:43.257 [2024-07-24 23:32:28.214732] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:43.257 [2024-07-24 23:32:28.215762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:43.257 [2024-07-24 23:32:28.215781] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:43.257 BaseBdev2 00:10:43.257 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:43.515 [2024-07-24 23:32:28.379128] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:43.515 [2024-07-24 23:32:28.379997] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:43.515 [2024-07-24 23:32:28.380127] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x11ac7a0 00:10:43.515 [2024-07-24 23:32:28.380136] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:43.515 [2024-07-24 23:32:28.380268] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1014180 00:10:43.515 [2024-07-24 23:32:28.380370] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11ac7a0 00:10:43.515 [2024-07-24 23:32:28.380375] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11ac7a0 00:10:43.515 [2024-07-24 23:32:28.380446] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.515 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:43.773 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:43.773 "name": "raid_bdev1", 00:10:43.773 "uuid": "d0000545-503e-41a8-9672-fa2769e6dd29", 00:10:43.773 "strip_size_kb": 0, 00:10:43.773 "state": "online", 00:10:43.773 "raid_level": "raid1", 00:10:43.773 "superblock": true, 00:10:43.773 "num_base_bdevs": 2, 00:10:43.773 "num_base_bdevs_discovered": 2, 00:10:43.773 "num_base_bdevs_operational": 2, 00:10:43.773 "base_bdevs_list": [ 00:10:43.773 { 00:10:43.773 "name": "BaseBdev1", 00:10:43.773 "uuid": "e6e98209-0386-592c-8fef-18feb1093aea", 00:10:43.773 "is_configured": true, 00:10:43.773 "data_offset": 2048, 00:10:43.773 "data_size": 63488 00:10:43.773 }, 00:10:43.773 { 00:10:43.773 "name": "BaseBdev2", 00:10:43.773 "uuid": "9a7709fd-d2b2-52e3-824d-29a097662905", 00:10:43.773 "is_configured": true, 00:10:43.773 "data_offset": 2048, 00:10:43.773 "data_size": 63488 00:10:43.773 } 00:10:43.774 ] 00:10:43.774 }' 00:10:43.774 23:32:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:43.774 23:32:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.031 
23:32:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:44.031 23:32:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:44.290 [2024-07-24 23:32:29.089167] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11a80f0 00:10:45.224 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:45.224 [2024-07-24 23:32:30.177963] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:10:45.225 [2024-07-24 23:32:30.178004] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:45.225 [2024-07-24 23:32:30.178158] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x11a80f0 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:10:45.225 23:32:30 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.225 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:45.483 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:45.483 "name": "raid_bdev1", 00:10:45.483 "uuid": "d0000545-503e-41a8-9672-fa2769e6dd29", 00:10:45.483 "strip_size_kb": 0, 00:10:45.483 "state": "online", 00:10:45.483 "raid_level": "raid1", 00:10:45.483 "superblock": true, 00:10:45.483 "num_base_bdevs": 2, 00:10:45.483 "num_base_bdevs_discovered": 1, 00:10:45.483 "num_base_bdevs_operational": 1, 00:10:45.483 "base_bdevs_list": [ 00:10:45.483 { 00:10:45.483 "name": null, 00:10:45.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.483 "is_configured": false, 00:10:45.483 "data_offset": 2048, 00:10:45.483 "data_size": 63488 00:10:45.483 }, 00:10:45.483 { 00:10:45.483 "name": "BaseBdev2", 00:10:45.483 "uuid": "9a7709fd-d2b2-52e3-824d-29a097662905", 00:10:45.483 "is_configured": true, 00:10:45.483 "data_offset": 2048, 00:10:45.483 "data_size": 63488 00:10:45.483 } 00:10:45.483 ] 00:10:45.483 }' 00:10:45.483 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:10:45.483 23:32:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.050 23:32:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:46.050 [2024-07-24 23:32:31.022612] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:46.050 [2024-07-24 23:32:31.022643] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:46.050 [2024-07-24 23:32:31.024636] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:46.050 [2024-07-24 23:32:31.024652] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:46.050 [2024-07-24 23:32:31.024686] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:46.050 [2024-07-24 23:32:31.024691] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11ac7a0 name raid_bdev1, state offline 00:10:46.050 0 00:10:46.050 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 262312 00:10:46.050 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 262312 ']' 00:10:46.050 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 262312 00:10:46.050 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 262312 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 
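The `killprocess 262312` sequence traced above does not kill blindly: it first probes the pid with `kill -0`, then resolves the process name via `ps --no-headers -o comm=` and refuses to signal `sudo`, and only then sends `kill`. A minimal Python sketch of that liveness probe (the function name `pid_alive` is ours, not SPDK's; the real helper also does the `ps` comm check):

```python
import os

def pid_alive(pid: int) -> bool:
    """Probe a pid the way `kill -0` does: signal 0 performs the
    existence and permission checks without delivering anything."""
    try:
        os.kill(pid, 0)
    except ProcessLookupError:
        return False          # no such process
    except PermissionError:
        return True           # exists, but owned by another user
    return True
```

The extra `ps -o comm=` step in the shell helper guards against the pid having been reused by an unrelated process between the probe and the `kill -9`.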
00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 262312' 00:10:46.308 killing process with pid 262312 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 262312 00:10:46.308 [2024-07-24 23:32:31.086345] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 262312 00:10:46.308 [2024-07-24 23:32:31.095150] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.rZTNl0xQUF 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:10:46.308 00:10:46.308 real 0m4.916s 00:10:46.308 user 0m7.529s 00:10:46.308 sys 0m0.705s 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:46.308 23:32:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.308 ************************************ 00:10:46.308 END TEST raid_write_error_test 00:10:46.308 ************************************ 00:10:46.567 23:32:31 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:46.567 23:32:31 bdev_raid -- bdev/bdev_raid.sh@866 -- # 
for level in raid0 concat raid1 00:10:46.567 23:32:31 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:10:46.567 23:32:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:46.567 23:32:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:46.567 23:32:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:46.567 ************************************ 00:10:46.567 START TEST raid_state_function_test 00:10:46.567 ************************************ 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 false 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:46.567 23:32:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=263277 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 263277' 00:10:46.567 Process raid pid: 263277 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 
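The prologue of `raid_state_function_test` traced above derives the `bdev_raid_create` arguments from the test parameters: `'[' raid0 '!=' raid1 ']'` makes every level except raid1 striped (`strip_size_create_arg='-z 64'`), and `'[' false = true ']'` leaves `superblock_create_arg` empty for this run. A hedged Python rendition of that derivation (the function name is ours; the `-s` superblock flag is an assumption for the true branch, which this log never takes):

```python
def raid_create_args(raid_level: str, superblock: bool) -> list:
    """Mirror the argument derivation done at the top of
    raid_state_function_test in bdev/bdev_raid.sh (simplified)."""
    args = []
    if raid_level != "raid1":
        args += ["-z", "64"]   # strip_size_create_arg: raid1 is unstriped
    if superblock:
        args.append("-s")      # superblock_create_arg (assumed flag)
    return args
```

For the `raid0 3 false` invocation in this log the result is just `-z 64`, matching the `bdev_raid_create -z 64 -r raid0 ...` calls that follow.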
00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 263277 /var/tmp/spdk-raid.sock 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 263277 ']' 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:46.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:46.567 23:32:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.567 [2024-07-24 23:32:31.401486] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
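`waitforlisten 263277 /var/tmp/spdk-raid.sock` above blocks until the freshly started `bdev_svc` accepts RPC connections on its UNIX domain socket. A simplified Python sketch of that polling loop (the real helper also rechecks that the pid is still alive each iteration):

```python
import socket
import time

def wait_for_unix_socket(path: str, timeout: float = 10.0) -> bool:
    """Poll until something accepts connections on a UNIX domain
    socket, the way waitforlisten polls the SPDK RPC socket."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
            try:
                s.connect(path)
                return True   # the server called listen(); RPC is up
            except (FileNotFoundError, ConnectionRefusedError):
                time.sleep(0.1)
    return False
```

A refused connection and a not-yet-created socket file are both treated as "not ready yet", since the daemon creates the file before it starts accepting.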
00:10:46.567 [2024-07-24 23:32:31.401521] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:46.567 [2024-07-24 23:32:31.467005] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:46.567 [2024-07-24 23:32:31.539155] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:46.825 [2024-07-24 23:32:31.594586] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:46.825 [2024-07-24 23:32:31.594613] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:47.391 [2024-07-24 23:32:32.349439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:47.391 [2024-07-24 23:32:32.349474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:47.391 [2024-07-24 23:32:32.349480] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:47.391 [2024-07-24 23:32:32.349486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:47.391 [2024-07-24 23:32:32.349490] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:47.391 [2024-07-24 23:32:32.349495] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:47.391 23:32:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.391 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:47.649 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.649 "name": "Existed_Raid", 00:10:47.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.649 "strip_size_kb": 64, 00:10:47.649 "state": "configuring", 00:10:47.649 "raid_level": "raid0", 00:10:47.649 "superblock": false, 00:10:47.649 "num_base_bdevs": 3, 00:10:47.649 "num_base_bdevs_discovered": 0, 00:10:47.649 "num_base_bdevs_operational": 3, 00:10:47.649 "base_bdevs_list": [ 00:10:47.649 { 
00:10:47.649 "name": "BaseBdev1", 00:10:47.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.649 "is_configured": false, 00:10:47.649 "data_offset": 0, 00:10:47.649 "data_size": 0 00:10:47.649 }, 00:10:47.649 { 00:10:47.649 "name": "BaseBdev2", 00:10:47.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.649 "is_configured": false, 00:10:47.649 "data_offset": 0, 00:10:47.649 "data_size": 0 00:10:47.649 }, 00:10:47.649 { 00:10:47.649 "name": "BaseBdev3", 00:10:47.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:47.649 "is_configured": false, 00:10:47.649 "data_offset": 0, 00:10:47.649 "data_size": 0 00:10:47.649 } 00:10:47.649 ] 00:10:47.649 }' 00:10:47.649 23:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.649 23:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.215 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:48.215 [2024-07-24 23:32:33.179513] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:48.215 [2024-07-24 23:32:33.179532] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1768b30 name Existed_Raid, state configuring 00:10:48.215 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:48.474 [2024-07-24 23:32:33.347979] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:48.474 [2024-07-24 23:32:33.347999] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:48.474 [2024-07-24 23:32:33.348005] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:10:48.474 [2024-07-24 23:32:33.348010] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:48.474 [2024-07-24 23:32:33.348015] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:48.474 [2024-07-24 23:32:33.348021] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:48.474 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:48.732 [2024-07-24 23:32:33.528559] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:48.732 BaseBdev1 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:48.732 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:48.989 [ 00:10:48.989 { 00:10:48.989 "name": "BaseBdev1", 00:10:48.989 "aliases": [ 00:10:48.989 
"d7830a20-464d-407a-a6d3-c3bd6c8b4488" 00:10:48.989 ], 00:10:48.989 "product_name": "Malloc disk", 00:10:48.989 "block_size": 512, 00:10:48.989 "num_blocks": 65536, 00:10:48.989 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:48.989 "assigned_rate_limits": { 00:10:48.989 "rw_ios_per_sec": 0, 00:10:48.989 "rw_mbytes_per_sec": 0, 00:10:48.989 "r_mbytes_per_sec": 0, 00:10:48.989 "w_mbytes_per_sec": 0 00:10:48.989 }, 00:10:48.989 "claimed": true, 00:10:48.989 "claim_type": "exclusive_write", 00:10:48.989 "zoned": false, 00:10:48.989 "supported_io_types": { 00:10:48.989 "read": true, 00:10:48.989 "write": true, 00:10:48.989 "unmap": true, 00:10:48.989 "flush": true, 00:10:48.989 "reset": true, 00:10:48.989 "nvme_admin": false, 00:10:48.989 "nvme_io": false, 00:10:48.989 "nvme_io_md": false, 00:10:48.989 "write_zeroes": true, 00:10:48.989 "zcopy": true, 00:10:48.989 "get_zone_info": false, 00:10:48.989 "zone_management": false, 00:10:48.989 "zone_append": false, 00:10:48.989 "compare": false, 00:10:48.989 "compare_and_write": false, 00:10:48.989 "abort": true, 00:10:48.989 "seek_hole": false, 00:10:48.989 "seek_data": false, 00:10:48.989 "copy": true, 00:10:48.989 "nvme_iov_md": false 00:10:48.989 }, 00:10:48.989 "memory_domains": [ 00:10:48.989 { 00:10:48.989 "dma_device_id": "system", 00:10:48.989 "dma_device_type": 1 00:10:48.989 }, 00:10:48.989 { 00:10:48.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.989 "dma_device_type": 2 00:10:48.989 } 00:10:48.989 ], 00:10:48.989 "driver_specific": {} 00:10:48.989 } 00:10:48.989 ] 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.989 23:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.247 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.247 "name": "Existed_Raid", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:49.247 "strip_size_kb": 64, 00:10:49.247 "state": "configuring", 00:10:49.247 "raid_level": "raid0", 00:10:49.247 "superblock": false, 00:10:49.247 "num_base_bdevs": 3, 00:10:49.247 "num_base_bdevs_discovered": 1, 00:10:49.247 "num_base_bdevs_operational": 3, 00:10:49.247 "base_bdevs_list": [ 00:10:49.247 { 00:10:49.247 "name": "BaseBdev1", 00:10:49.247 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:49.247 "is_configured": true, 00:10:49.247 "data_offset": 0, 00:10:49.247 "data_size": 65536 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "name": "BaseBdev2", 00:10:49.247 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:49.247 "is_configured": false, 00:10:49.247 "data_offset": 0, 00:10:49.247 "data_size": 0 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "name": "BaseBdev3", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:49.247 "is_configured": false, 00:10:49.247 "data_offset": 0, 00:10:49.247 "data_size": 0 00:10:49.247 } 00:10:49.247 ] 00:10:49.247 }' 00:10:49.247 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.247 23:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:49.812 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:49.812 [2024-07-24 23:32:34.679575] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:49.812 [2024-07-24 23:32:34.679606] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17683a0 name Existed_Raid, state configuring 00:10:49.812 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:50.070 [2024-07-24 23:32:34.848039] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:50.070 [2024-07-24 23:32:34.849078] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:50.070 [2024-07-24 23:32:34.849106] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:50.070 [2024-07-24 23:32:34.849111] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:50.070 [2024-07-24 23:32:34.849116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.070 23:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:50.070 23:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:50.070 "name": "Existed_Raid", 00:10:50.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.070 "strip_size_kb": 64, 00:10:50.070 "state": "configuring", 00:10:50.070 
"raid_level": "raid0", 00:10:50.070 "superblock": false, 00:10:50.070 "num_base_bdevs": 3, 00:10:50.070 "num_base_bdevs_discovered": 1, 00:10:50.070 "num_base_bdevs_operational": 3, 00:10:50.070 "base_bdevs_list": [ 00:10:50.070 { 00:10:50.070 "name": "BaseBdev1", 00:10:50.070 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:50.070 "is_configured": true, 00:10:50.070 "data_offset": 0, 00:10:50.070 "data_size": 65536 00:10:50.070 }, 00:10:50.070 { 00:10:50.070 "name": "BaseBdev2", 00:10:50.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.070 "is_configured": false, 00:10:50.070 "data_offset": 0, 00:10:50.070 "data_size": 0 00:10:50.070 }, 00:10:50.070 { 00:10:50.070 "name": "BaseBdev3", 00:10:50.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:50.070 "is_configured": false, 00:10:50.070 "data_offset": 0, 00:10:50.070 "data_size": 0 00:10:50.070 } 00:10:50.070 ] 00:10:50.070 }' 00:10:50.070 23:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.070 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:50.635 23:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:50.893 [2024-07-24 23:32:35.684831] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:50.893 BaseBdev2 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:50.893 23:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:51.151 [ 00:10:51.151 { 00:10:51.151 "name": "BaseBdev2", 00:10:51.151 "aliases": [ 00:10:51.151 "36287e24-c7c2-4c14-81f5-44960320b89d" 00:10:51.151 ], 00:10:51.151 "product_name": "Malloc disk", 00:10:51.151 "block_size": 512, 00:10:51.151 "num_blocks": 65536, 00:10:51.151 "uuid": "36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:51.151 "assigned_rate_limits": { 00:10:51.151 "rw_ios_per_sec": 0, 00:10:51.151 "rw_mbytes_per_sec": 0, 00:10:51.151 "r_mbytes_per_sec": 0, 00:10:51.151 "w_mbytes_per_sec": 0 00:10:51.151 }, 00:10:51.151 "claimed": true, 00:10:51.151 "claim_type": "exclusive_write", 00:10:51.151 "zoned": false, 00:10:51.151 "supported_io_types": { 00:10:51.151 "read": true, 00:10:51.151 "write": true, 00:10:51.151 "unmap": true, 00:10:51.151 "flush": true, 00:10:51.151 "reset": true, 00:10:51.151 "nvme_admin": false, 00:10:51.151 "nvme_io": false, 00:10:51.151 "nvme_io_md": false, 00:10:51.151 "write_zeroes": true, 00:10:51.151 "zcopy": true, 00:10:51.151 "get_zone_info": false, 00:10:51.151 "zone_management": false, 00:10:51.151 "zone_append": false, 00:10:51.151 "compare": false, 00:10:51.151 "compare_and_write": false, 00:10:51.151 "abort": true, 00:10:51.151 "seek_hole": false, 00:10:51.151 "seek_data": false, 00:10:51.151 "copy": true, 00:10:51.151 "nvme_iov_md": false 00:10:51.151 }, 00:10:51.151 "memory_domains": [ 00:10:51.151 { 00:10:51.151 "dma_device_id": "system", 
00:10:51.151 "dma_device_type": 1 00:10:51.151 }, 00:10:51.151 { 00:10:51.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.151 "dma_device_type": 2 00:10:51.151 } 00:10:51.151 ], 00:10:51.151 "driver_specific": {} 00:10:51.151 } 00:10:51.151 ] 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.151 23:32:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:51.409 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:51.409 "name": "Existed_Raid", 00:10:51.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.409 "strip_size_kb": 64, 00:10:51.409 "state": "configuring", 00:10:51.409 "raid_level": "raid0", 00:10:51.409 "superblock": false, 00:10:51.409 "num_base_bdevs": 3, 00:10:51.409 "num_base_bdevs_discovered": 2, 00:10:51.409 "num_base_bdevs_operational": 3, 00:10:51.409 "base_bdevs_list": [ 00:10:51.409 { 00:10:51.409 "name": "BaseBdev1", 00:10:51.409 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:51.409 "is_configured": true, 00:10:51.409 "data_offset": 0, 00:10:51.409 "data_size": 65536 00:10:51.409 }, 00:10:51.409 { 00:10:51.409 "name": "BaseBdev2", 00:10:51.409 "uuid": "36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:51.409 "is_configured": true, 00:10:51.409 "data_offset": 0, 00:10:51.409 "data_size": 65536 00:10:51.409 }, 00:10:51.409 { 00:10:51.409 "name": "BaseBdev3", 00:10:51.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.409 "is_configured": false, 00:10:51.409 "data_offset": 0, 00:10:51.409 "data_size": 0 00:10:51.409 } 00:10:51.409 ] 00:10:51.409 }' 00:10:51.409 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:51.409 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:51.974 [2024-07-24 23:32:36.862455] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:51.974 [2024-07-24 23:32:36.862490] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17692a0 00:10:51.974 [2024-07-24 23:32:36.862494] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:10:51.974 [2024-07-24 23:32:36.862628] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17640e0 00:10:51.974 [2024-07-24 23:32:36.862713] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17692a0 00:10:51.974 [2024-07-24 23:32:36.862718] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17692a0 00:10:51.974 [2024-07-24 23:32:36.862831] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:51.974 BaseBdev3 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:51.974 23:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:52.232 [ 00:10:52.232 { 00:10:52.232 "name": "BaseBdev3", 00:10:52.232 "aliases": [ 00:10:52.232 "10b30bb1-e51b-44b5-83bb-926efc82f24a" 00:10:52.232 ], 00:10:52.232 "product_name": "Malloc disk", 00:10:52.232 "block_size": 512, 00:10:52.232 "num_blocks": 65536, 00:10:52.232 
"uuid": "10b30bb1-e51b-44b5-83bb-926efc82f24a", 00:10:52.232 "assigned_rate_limits": { 00:10:52.232 "rw_ios_per_sec": 0, 00:10:52.232 "rw_mbytes_per_sec": 0, 00:10:52.232 "r_mbytes_per_sec": 0, 00:10:52.232 "w_mbytes_per_sec": 0 00:10:52.232 }, 00:10:52.232 "claimed": true, 00:10:52.232 "claim_type": "exclusive_write", 00:10:52.232 "zoned": false, 00:10:52.232 "supported_io_types": { 00:10:52.232 "read": true, 00:10:52.232 "write": true, 00:10:52.232 "unmap": true, 00:10:52.232 "flush": true, 00:10:52.232 "reset": true, 00:10:52.232 "nvme_admin": false, 00:10:52.232 "nvme_io": false, 00:10:52.232 "nvme_io_md": false, 00:10:52.232 "write_zeroes": true, 00:10:52.232 "zcopy": true, 00:10:52.232 "get_zone_info": false, 00:10:52.232 "zone_management": false, 00:10:52.232 "zone_append": false, 00:10:52.232 "compare": false, 00:10:52.232 "compare_and_write": false, 00:10:52.232 "abort": true, 00:10:52.232 "seek_hole": false, 00:10:52.232 "seek_data": false, 00:10:52.232 "copy": true, 00:10:52.232 "nvme_iov_md": false 00:10:52.232 }, 00:10:52.232 "memory_domains": [ 00:10:52.232 { 00:10:52.232 "dma_device_id": "system", 00:10:52.232 "dma_device_type": 1 00:10:52.232 }, 00:10:52.232 { 00:10:52.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.232 "dma_device_type": 2 00:10:52.232 } 00:10:52.232 ], 00:10:52.232 "driver_specific": {} 00:10:52.232 } 00:10:52.232 ] 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:52.232 23:32:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.232 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:52.491 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:52.491 "name": "Existed_Raid", 00:10:52.491 "uuid": "c0394123-66f8-4306-b3e2-927b1fb6fb83", 00:10:52.491 "strip_size_kb": 64, 00:10:52.491 "state": "online", 00:10:52.491 "raid_level": "raid0", 00:10:52.491 "superblock": false, 00:10:52.491 "num_base_bdevs": 3, 00:10:52.491 "num_base_bdevs_discovered": 3, 00:10:52.491 "num_base_bdevs_operational": 3, 00:10:52.491 "base_bdevs_list": [ 00:10:52.491 { 00:10:52.491 "name": "BaseBdev1", 00:10:52.491 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:52.491 "is_configured": true, 00:10:52.491 "data_offset": 0, 00:10:52.491 "data_size": 65536 00:10:52.491 }, 00:10:52.491 { 00:10:52.491 "name": "BaseBdev2", 00:10:52.491 "uuid": 
"36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:52.491 "is_configured": true, 00:10:52.491 "data_offset": 0, 00:10:52.491 "data_size": 65536 00:10:52.491 }, 00:10:52.491 { 00:10:52.491 "name": "BaseBdev3", 00:10:52.491 "uuid": "10b30bb1-e51b-44b5-83bb-926efc82f24a", 00:10:52.491 "is_configured": true, 00:10:52.491 "data_offset": 0, 00:10:52.491 "data_size": 65536 00:10:52.491 } 00:10:52.491 ] 00:10:52.491 }' 00:10:52.491 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:52.491 23:32:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:53.056 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:53.057 23:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:53.314 [2024-07-24 23:32:38.057749] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:53.315 "name": "Existed_Raid", 00:10:53.315 "aliases": [ 00:10:53.315 "c0394123-66f8-4306-b3e2-927b1fb6fb83" 00:10:53.315 ], 00:10:53.315 "product_name": "Raid Volume", 
00:10:53.315 "block_size": 512, 00:10:53.315 "num_blocks": 196608, 00:10:53.315 "uuid": "c0394123-66f8-4306-b3e2-927b1fb6fb83", 00:10:53.315 "assigned_rate_limits": { 00:10:53.315 "rw_ios_per_sec": 0, 00:10:53.315 "rw_mbytes_per_sec": 0, 00:10:53.315 "r_mbytes_per_sec": 0, 00:10:53.315 "w_mbytes_per_sec": 0 00:10:53.315 }, 00:10:53.315 "claimed": false, 00:10:53.315 "zoned": false, 00:10:53.315 "supported_io_types": { 00:10:53.315 "read": true, 00:10:53.315 "write": true, 00:10:53.315 "unmap": true, 00:10:53.315 "flush": true, 00:10:53.315 "reset": true, 00:10:53.315 "nvme_admin": false, 00:10:53.315 "nvme_io": false, 00:10:53.315 "nvme_io_md": false, 00:10:53.315 "write_zeroes": true, 00:10:53.315 "zcopy": false, 00:10:53.315 "get_zone_info": false, 00:10:53.315 "zone_management": false, 00:10:53.315 "zone_append": false, 00:10:53.315 "compare": false, 00:10:53.315 "compare_and_write": false, 00:10:53.315 "abort": false, 00:10:53.315 "seek_hole": false, 00:10:53.315 "seek_data": false, 00:10:53.315 "copy": false, 00:10:53.315 "nvme_iov_md": false 00:10:53.315 }, 00:10:53.315 "memory_domains": [ 00:10:53.315 { 00:10:53.315 "dma_device_id": "system", 00:10:53.315 "dma_device_type": 1 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.315 "dma_device_type": 2 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "system", 00:10:53.315 "dma_device_type": 1 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.315 "dma_device_type": 2 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "system", 00:10:53.315 "dma_device_type": 1 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.315 "dma_device_type": 2 00:10:53.315 } 00:10:53.315 ], 00:10:53.315 "driver_specific": { 00:10:53.315 "raid": { 00:10:53.315 "uuid": "c0394123-66f8-4306-b3e2-927b1fb6fb83", 00:10:53.315 "strip_size_kb": 64, 00:10:53.315 "state": "online", 00:10:53.315 
"raid_level": "raid0", 00:10:53.315 "superblock": false, 00:10:53.315 "num_base_bdevs": 3, 00:10:53.315 "num_base_bdevs_discovered": 3, 00:10:53.315 "num_base_bdevs_operational": 3, 00:10:53.315 "base_bdevs_list": [ 00:10:53.315 { 00:10:53.315 "name": "BaseBdev1", 00:10:53.315 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:53.315 "is_configured": true, 00:10:53.315 "data_offset": 0, 00:10:53.315 "data_size": 65536 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "name": "BaseBdev2", 00:10:53.315 "uuid": "36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:53.315 "is_configured": true, 00:10:53.315 "data_offset": 0, 00:10:53.315 "data_size": 65536 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "name": "BaseBdev3", 00:10:53.315 "uuid": "10b30bb1-e51b-44b5-83bb-926efc82f24a", 00:10:53.315 "is_configured": true, 00:10:53.315 "data_offset": 0, 00:10:53.315 "data_size": 65536 00:10:53.315 } 00:10:53.315 ] 00:10:53.315 } 00:10:53.315 } 00:10:53.315 }' 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:53.315 BaseBdev2 00:10:53.315 BaseBdev3' 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:53.315 "name": "BaseBdev1", 00:10:53.315 "aliases": [ 00:10:53.315 "d7830a20-464d-407a-a6d3-c3bd6c8b4488" 00:10:53.315 ], 00:10:53.315 "product_name": "Malloc disk", 00:10:53.315 
"block_size": 512, 00:10:53.315 "num_blocks": 65536, 00:10:53.315 "uuid": "d7830a20-464d-407a-a6d3-c3bd6c8b4488", 00:10:53.315 "assigned_rate_limits": { 00:10:53.315 "rw_ios_per_sec": 0, 00:10:53.315 "rw_mbytes_per_sec": 0, 00:10:53.315 "r_mbytes_per_sec": 0, 00:10:53.315 "w_mbytes_per_sec": 0 00:10:53.315 }, 00:10:53.315 "claimed": true, 00:10:53.315 "claim_type": "exclusive_write", 00:10:53.315 "zoned": false, 00:10:53.315 "supported_io_types": { 00:10:53.315 "read": true, 00:10:53.315 "write": true, 00:10:53.315 "unmap": true, 00:10:53.315 "flush": true, 00:10:53.315 "reset": true, 00:10:53.315 "nvme_admin": false, 00:10:53.315 "nvme_io": false, 00:10:53.315 "nvme_io_md": false, 00:10:53.315 "write_zeroes": true, 00:10:53.315 "zcopy": true, 00:10:53.315 "get_zone_info": false, 00:10:53.315 "zone_management": false, 00:10:53.315 "zone_append": false, 00:10:53.315 "compare": false, 00:10:53.315 "compare_and_write": false, 00:10:53.315 "abort": true, 00:10:53.315 "seek_hole": false, 00:10:53.315 "seek_data": false, 00:10:53.315 "copy": true, 00:10:53.315 "nvme_iov_md": false 00:10:53.315 }, 00:10:53.315 "memory_domains": [ 00:10:53.315 { 00:10:53.315 "dma_device_id": "system", 00:10:53.315 "dma_device_type": 1 00:10:53.315 }, 00:10:53.315 { 00:10:53.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.315 "dma_device_type": 2 00:10:53.315 } 00:10:53.315 ], 00:10:53.315 "driver_specific": {} 00:10:53.315 }' 00:10:53.315 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:53.573 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:53.831 "name": "BaseBdev2", 00:10:53.831 "aliases": [ 00:10:53.831 "36287e24-c7c2-4c14-81f5-44960320b89d" 00:10:53.831 ], 00:10:53.831 "product_name": "Malloc disk", 00:10:53.831 "block_size": 512, 00:10:53.831 "num_blocks": 65536, 00:10:53.831 "uuid": "36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:53.831 "assigned_rate_limits": { 00:10:53.831 "rw_ios_per_sec": 0, 00:10:53.831 "rw_mbytes_per_sec": 0, 00:10:53.831 "r_mbytes_per_sec": 0, 00:10:53.831 "w_mbytes_per_sec": 0 00:10:53.831 }, 00:10:53.831 "claimed": true, 00:10:53.831 "claim_type": "exclusive_write", 00:10:53.831 "zoned": false, 00:10:53.831 "supported_io_types": { 00:10:53.831 "read": true, 00:10:53.831 "write": true, 00:10:53.831 "unmap": true, 00:10:53.831 "flush": true, 00:10:53.831 "reset": true, 00:10:53.831 "nvme_admin": 
false, 00:10:53.831 "nvme_io": false, 00:10:53.831 "nvme_io_md": false, 00:10:53.831 "write_zeroes": true, 00:10:53.831 "zcopy": true, 00:10:53.831 "get_zone_info": false, 00:10:53.831 "zone_management": false, 00:10:53.831 "zone_append": false, 00:10:53.831 "compare": false, 00:10:53.831 "compare_and_write": false, 00:10:53.831 "abort": true, 00:10:53.831 "seek_hole": false, 00:10:53.831 "seek_data": false, 00:10:53.831 "copy": true, 00:10:53.831 "nvme_iov_md": false 00:10:53.831 }, 00:10:53.831 "memory_domains": [ 00:10:53.831 { 00:10:53.831 "dma_device_id": "system", 00:10:53.831 "dma_device_type": 1 00:10:53.831 }, 00:10:53.831 { 00:10:53.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.831 "dma_device_type": 2 00:10:53.831 } 00:10:53.831 ], 00:10:53.831 "driver_specific": {} 00:10:53.831 }' 00:10:53.831 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.089 23:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.089 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:54.089 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:54.089 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:54.347 "name": "BaseBdev3", 00:10:54.347 "aliases": [ 00:10:54.347 "10b30bb1-e51b-44b5-83bb-926efc82f24a" 00:10:54.347 ], 00:10:54.347 "product_name": "Malloc disk", 00:10:54.347 "block_size": 512, 00:10:54.347 "num_blocks": 65536, 00:10:54.347 "uuid": "10b30bb1-e51b-44b5-83bb-926efc82f24a", 00:10:54.347 "assigned_rate_limits": { 00:10:54.347 "rw_ios_per_sec": 0, 00:10:54.347 "rw_mbytes_per_sec": 0, 00:10:54.347 "r_mbytes_per_sec": 0, 00:10:54.347 "w_mbytes_per_sec": 0 00:10:54.347 }, 00:10:54.347 "claimed": true, 00:10:54.347 "claim_type": "exclusive_write", 00:10:54.347 "zoned": false, 00:10:54.347 "supported_io_types": { 00:10:54.347 "read": true, 00:10:54.347 "write": true, 00:10:54.347 "unmap": true, 00:10:54.347 "flush": true, 00:10:54.347 "reset": true, 00:10:54.347 "nvme_admin": false, 00:10:54.347 "nvme_io": false, 00:10:54.347 "nvme_io_md": false, 00:10:54.347 "write_zeroes": true, 00:10:54.347 "zcopy": true, 00:10:54.347 "get_zone_info": false, 00:10:54.347 "zone_management": false, 00:10:54.347 "zone_append": false, 00:10:54.347 "compare": false, 00:10:54.347 "compare_and_write": false, 00:10:54.347 "abort": true, 00:10:54.347 "seek_hole": false, 00:10:54.347 "seek_data": false, 00:10:54.347 "copy": true, 00:10:54.347 "nvme_iov_md": false 00:10:54.347 }, 00:10:54.347 "memory_domains": [ 00:10:54.347 { 00:10:54.347 "dma_device_id": "system", 00:10:54.347 "dma_device_type": 1 00:10:54.347 
}, 00:10:54.347 { 00:10:54.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.347 "dma_device_type": 2 00:10:54.347 } 00:10:54.347 ], 00:10:54.347 "driver_specific": {} 00:10:54.347 }' 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.347 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:54.605 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:54.863 [2024-07-24 23:32:39.753973] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:54.863 [2024-07-24 23:32:39.753993] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:54.863 [2024-07-24 23:32:39.754021] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:54.863 
23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.863 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:10:55.121 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:55.121 "name": "Existed_Raid", 00:10:55.121 "uuid": "c0394123-66f8-4306-b3e2-927b1fb6fb83", 00:10:55.121 "strip_size_kb": 64, 00:10:55.121 "state": "offline", 00:10:55.121 "raid_level": "raid0", 00:10:55.121 "superblock": false, 00:10:55.121 "num_base_bdevs": 3, 00:10:55.121 "num_base_bdevs_discovered": 2, 00:10:55.121 "num_base_bdevs_operational": 2, 00:10:55.121 "base_bdevs_list": [ 00:10:55.121 { 00:10:55.121 "name": null, 00:10:55.121 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:55.121 "is_configured": false, 00:10:55.121 "data_offset": 0, 00:10:55.121 "data_size": 65536 00:10:55.121 }, 00:10:55.121 { 00:10:55.121 "name": "BaseBdev2", 00:10:55.121 "uuid": "36287e24-c7c2-4c14-81f5-44960320b89d", 00:10:55.121 "is_configured": true, 00:10:55.121 "data_offset": 0, 00:10:55.121 "data_size": 65536 00:10:55.121 }, 00:10:55.121 { 00:10:55.121 "name": "BaseBdev3", 00:10:55.121 "uuid": "10b30bb1-e51b-44b5-83bb-926efc82f24a", 00:10:55.121 "is_configured": true, 00:10:55.121 "data_offset": 0, 00:10:55.121 "data_size": 65536 00:10:55.121 } 00:10:55.121 ] 00:10:55.121 }' 00:10:55.121 23:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:55.121 23:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:55.714 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:55.972 [2024-07-24 23:32:40.765530] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:55.972 23:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:10:56.229 [2024-07-24 23:32:41.100277] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:10:56.229 [2024-07-24 23:32:41.100313] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17692a0 name Existed_Raid, state offline 00:10:56.229 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:56.229 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:56.229 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.229 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:56.487 BaseBdev2 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:56.487 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:56.745 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:57.003 [ 00:10:57.003 { 00:10:57.003 "name": "BaseBdev2", 00:10:57.003 "aliases": [ 00:10:57.003 "d7010fa9-2e60-4663-acb9-fcfac5c92043" 00:10:57.003 ], 00:10:57.003 "product_name": "Malloc disk", 00:10:57.003 "block_size": 512, 00:10:57.003 "num_blocks": 65536, 00:10:57.003 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:10:57.003 "assigned_rate_limits": { 00:10:57.003 "rw_ios_per_sec": 0, 00:10:57.003 "rw_mbytes_per_sec": 0, 00:10:57.003 "r_mbytes_per_sec": 0, 00:10:57.003 "w_mbytes_per_sec": 0 00:10:57.003 }, 00:10:57.003 "claimed": false, 00:10:57.003 "zoned": false, 00:10:57.003 "supported_io_types": { 00:10:57.003 "read": true, 00:10:57.003 "write": true, 00:10:57.003 "unmap": true, 00:10:57.003 "flush": true, 00:10:57.003 "reset": true, 00:10:57.003 "nvme_admin": false, 00:10:57.003 "nvme_io": false, 00:10:57.003 "nvme_io_md": false, 00:10:57.003 "write_zeroes": true, 00:10:57.003 "zcopy": true, 00:10:57.003 "get_zone_info": false, 00:10:57.003 "zone_management": false, 00:10:57.003 "zone_append": false, 00:10:57.003 "compare": false, 00:10:57.003 "compare_and_write": false, 00:10:57.003 "abort": true, 00:10:57.003 "seek_hole": false, 00:10:57.003 "seek_data": false, 00:10:57.003 "copy": true, 00:10:57.003 "nvme_iov_md": false 00:10:57.003 }, 00:10:57.003 "memory_domains": [ 00:10:57.003 { 00:10:57.003 "dma_device_id": "system", 00:10:57.003 "dma_device_type": 1 00:10:57.003 }, 00:10:57.003 { 00:10:57.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.003 "dma_device_type": 2 00:10:57.003 } 00:10:57.003 ], 00:10:57.003 "driver_specific": {} 00:10:57.003 } 00:10:57.003 ] 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:57.003 BaseBdev3 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:57.003 23:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:57.261 23:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:57.519 [ 00:10:57.519 { 00:10:57.519 "name": "BaseBdev3", 00:10:57.519 "aliases": [ 00:10:57.519 "40351b8f-5862-423f-885d-71880253c516" 00:10:57.519 ], 00:10:57.519 "product_name": "Malloc disk", 00:10:57.519 "block_size": 512, 00:10:57.519 "num_blocks": 65536, 00:10:57.519 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:10:57.519 "assigned_rate_limits": { 00:10:57.519 "rw_ios_per_sec": 0, 00:10:57.519 "rw_mbytes_per_sec": 0, 00:10:57.519 "r_mbytes_per_sec": 0, 00:10:57.519 "w_mbytes_per_sec": 0 00:10:57.519 }, 00:10:57.519 "claimed": false, 00:10:57.519 "zoned": false, 00:10:57.519 
"supported_io_types": { 00:10:57.519 "read": true, 00:10:57.519 "write": true, 00:10:57.519 "unmap": true, 00:10:57.519 "flush": true, 00:10:57.519 "reset": true, 00:10:57.519 "nvme_admin": false, 00:10:57.519 "nvme_io": false, 00:10:57.519 "nvme_io_md": false, 00:10:57.519 "write_zeroes": true, 00:10:57.519 "zcopy": true, 00:10:57.519 "get_zone_info": false, 00:10:57.519 "zone_management": false, 00:10:57.519 "zone_append": false, 00:10:57.519 "compare": false, 00:10:57.519 "compare_and_write": false, 00:10:57.519 "abort": true, 00:10:57.519 "seek_hole": false, 00:10:57.519 "seek_data": false, 00:10:57.519 "copy": true, 00:10:57.519 "nvme_iov_md": false 00:10:57.519 }, 00:10:57.519 "memory_domains": [ 00:10:57.519 { 00:10:57.519 "dma_device_id": "system", 00:10:57.519 "dma_device_type": 1 00:10:57.519 }, 00:10:57.519 { 00:10:57.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.519 "dma_device_type": 2 00:10:57.519 } 00:10:57.519 ], 00:10:57.519 "driver_specific": {} 00:10:57.519 } 00:10:57.519 ] 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:57.519 [2024-07-24 23:32:42.425069] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:57.519 [2024-07-24 23:32:42.425101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:57.519 [2024-07-24 23:32:42.425112] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:57.519 
[2024-07-24 23:32:42.426079] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.519 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:57.777 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:57.777 "name": "Existed_Raid", 00:10:57.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.777 "strip_size_kb": 64, 00:10:57.777 "state": "configuring", 00:10:57.777 "raid_level": "raid0", 00:10:57.777 "superblock": false, 00:10:57.777 "num_base_bdevs": 3, 00:10:57.777 
"num_base_bdevs_discovered": 2, 00:10:57.777 "num_base_bdevs_operational": 3, 00:10:57.777 "base_bdevs_list": [ 00:10:57.777 { 00:10:57.777 "name": "BaseBdev1", 00:10:57.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:57.777 "is_configured": false, 00:10:57.777 "data_offset": 0, 00:10:57.777 "data_size": 0 00:10:57.777 }, 00:10:57.777 { 00:10:57.777 "name": "BaseBdev2", 00:10:57.777 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:10:57.777 "is_configured": true, 00:10:57.777 "data_offset": 0, 00:10:57.777 "data_size": 65536 00:10:57.777 }, 00:10:57.777 { 00:10:57.777 "name": "BaseBdev3", 00:10:57.777 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:10:57.777 "is_configured": true, 00:10:57.777 "data_offset": 0, 00:10:57.777 "data_size": 65536 00:10:57.777 } 00:10:57.777 ] 00:10:57.777 }' 00:10:57.777 23:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:57.777 23:32:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:10:58.343 [2024-07-24 23:32:43.247179] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.343 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:58.600 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.601 "name": "Existed_Raid", 00:10:58.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:58.601 "strip_size_kb": 64, 00:10:58.601 "state": "configuring", 00:10:58.601 "raid_level": "raid0", 00:10:58.601 "superblock": false, 00:10:58.601 "num_base_bdevs": 3, 00:10:58.601 "num_base_bdevs_discovered": 1, 00:10:58.601 "num_base_bdevs_operational": 3, 00:10:58.601 "base_bdevs_list": [ 00:10:58.601 { 00:10:58.601 "name": "BaseBdev1", 00:10:58.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:58.601 "is_configured": false, 00:10:58.601 "data_offset": 0, 00:10:58.601 "data_size": 0 00:10:58.601 }, 00:10:58.601 { 00:10:58.601 "name": null, 00:10:58.601 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:10:58.601 "is_configured": false, 00:10:58.601 "data_offset": 0, 00:10:58.601 "data_size": 65536 00:10:58.601 }, 00:10:58.601 { 00:10:58.601 "name": "BaseBdev3", 00:10:58.601 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:10:58.601 "is_configured": true, 00:10:58.601 "data_offset": 0, 00:10:58.601 "data_size": 65536 00:10:58.601 } 
00:10:58.601 ] 00:10:58.601 }' 00:10:58.601 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.601 23:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:59.166 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.166 23:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:10:59.166 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:10:59.166 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:59.423 [2024-07-24 23:32:44.244515] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:59.423 BaseBdev1 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:59.423 23:32:44 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:59.679 [ 00:10:59.679 { 00:10:59.679 "name": "BaseBdev1", 00:10:59.680 "aliases": [ 00:10:59.680 "53d245ae-49e9-4301-9bee-2bb5c5fa575c" 00:10:59.680 ], 00:10:59.680 "product_name": "Malloc disk", 00:10:59.680 "block_size": 512, 00:10:59.680 "num_blocks": 65536, 00:10:59.680 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:10:59.680 "assigned_rate_limits": { 00:10:59.680 "rw_ios_per_sec": 0, 00:10:59.680 "rw_mbytes_per_sec": 0, 00:10:59.680 "r_mbytes_per_sec": 0, 00:10:59.680 "w_mbytes_per_sec": 0 00:10:59.680 }, 00:10:59.680 "claimed": true, 00:10:59.680 "claim_type": "exclusive_write", 00:10:59.680 "zoned": false, 00:10:59.680 "supported_io_types": { 00:10:59.680 "read": true, 00:10:59.680 "write": true, 00:10:59.680 "unmap": true, 00:10:59.680 "flush": true, 00:10:59.680 "reset": true, 00:10:59.680 "nvme_admin": false, 00:10:59.680 "nvme_io": false, 00:10:59.680 "nvme_io_md": false, 00:10:59.680 "write_zeroes": true, 00:10:59.680 "zcopy": true, 00:10:59.680 "get_zone_info": false, 00:10:59.680 "zone_management": false, 00:10:59.680 "zone_append": false, 00:10:59.680 "compare": false, 00:10:59.680 "compare_and_write": false, 00:10:59.680 "abort": true, 00:10:59.680 "seek_hole": false, 00:10:59.680 "seek_data": false, 00:10:59.680 "copy": true, 00:10:59.680 "nvme_iov_md": false 00:10:59.680 }, 00:10:59.680 "memory_domains": [ 00:10:59.680 { 00:10:59.680 "dma_device_id": "system", 00:10:59.680 "dma_device_type": 1 00:10:59.680 }, 00:10:59.680 { 00:10:59.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:59.680 "dma_device_type": 2 00:10:59.680 } 00:10:59.680 ], 00:10:59.680 "driver_specific": {} 00:10:59.680 } 00:10:59.680 ] 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.680 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.938 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.938 "name": "Existed_Raid", 00:10:59.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.938 "strip_size_kb": 64, 00:10:59.938 "state": "configuring", 00:10:59.938 "raid_level": "raid0", 00:10:59.938 "superblock": false, 00:10:59.938 "num_base_bdevs": 3, 00:10:59.938 "num_base_bdevs_discovered": 2, 00:10:59.938 "num_base_bdevs_operational": 3, 00:10:59.938 "base_bdevs_list": [ 00:10:59.938 { 00:10:59.938 "name": "BaseBdev1", 00:10:59.938 
"uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:10:59.938 "is_configured": true, 00:10:59.938 "data_offset": 0, 00:10:59.938 "data_size": 65536 00:10:59.938 }, 00:10:59.938 { 00:10:59.938 "name": null, 00:10:59.938 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:10:59.938 "is_configured": false, 00:10:59.938 "data_offset": 0, 00:10:59.938 "data_size": 65536 00:10:59.938 }, 00:10:59.938 { 00:10:59.938 "name": "BaseBdev3", 00:10:59.938 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:10:59.938 "is_configured": true, 00:10:59.938 "data_offset": 0, 00:10:59.938 "data_size": 65536 00:10:59.938 } 00:10:59.938 ] 00:10:59.938 }' 00:10:59.938 23:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.938 23:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:00.503 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:00.503 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.503 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:00.503 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:00.761 [2024-07-24 23:32:45.567960] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:00.761 "name": "Existed_Raid", 00:11:00.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:00.761 "strip_size_kb": 64, 00:11:00.761 "state": "configuring", 00:11:00.761 "raid_level": "raid0", 00:11:00.761 "superblock": false, 00:11:00.761 "num_base_bdevs": 3, 00:11:00.761 "num_base_bdevs_discovered": 1, 00:11:00.761 "num_base_bdevs_operational": 3, 00:11:00.761 "base_bdevs_list": [ 00:11:00.761 { 00:11:00.761 "name": "BaseBdev1", 00:11:00.761 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:00.761 "is_configured": true, 00:11:00.761 "data_offset": 0, 00:11:00.761 "data_size": 65536 00:11:00.761 }, 00:11:00.761 { 00:11:00.761 "name": null, 00:11:00.761 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:00.761 "is_configured": false, 00:11:00.761 
"data_offset": 0, 00:11:00.761 "data_size": 65536 00:11:00.761 }, 00:11:00.761 { 00:11:00.761 "name": null, 00:11:00.761 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:00.761 "is_configured": false, 00:11:00.761 "data_offset": 0, 00:11:00.761 "data_size": 65536 00:11:00.761 } 00:11:00.761 ] 00:11:00.761 }' 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:00.761 23:32:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.327 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.327 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:01.585 [2024-07-24 23:32:46.562557] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.585 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.842 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.842 "name": "Existed_Raid", 00:11:01.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:01.842 "strip_size_kb": 64, 00:11:01.842 "state": "configuring", 00:11:01.842 "raid_level": "raid0", 00:11:01.842 "superblock": false, 00:11:01.842 "num_base_bdevs": 3, 00:11:01.842 "num_base_bdevs_discovered": 2, 00:11:01.842 "num_base_bdevs_operational": 3, 00:11:01.842 "base_bdevs_list": [ 00:11:01.842 { 00:11:01.842 "name": "BaseBdev1", 00:11:01.842 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:01.842 "is_configured": true, 00:11:01.842 "data_offset": 0, 00:11:01.842 "data_size": 65536 00:11:01.842 }, 00:11:01.842 { 00:11:01.842 "name": null, 00:11:01.842 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:01.842 "is_configured": false, 00:11:01.842 "data_offset": 0, 00:11:01.842 "data_size": 65536 00:11:01.842 }, 00:11:01.842 { 00:11:01.842 "name": "BaseBdev3", 00:11:01.842 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:01.842 "is_configured": true, 00:11:01.842 "data_offset": 0, 00:11:01.842 "data_size": 65536 00:11:01.842 } 00:11:01.842 ] 
00:11:01.842 }' 00:11:01.842 23:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.842 23:32:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:02.404 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.404 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:02.404 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:02.404 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:02.663 [2024-07-24 23:32:47.533067] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.663 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:02.920 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:02.920 "name": "Existed_Raid", 00:11:02.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:02.920 "strip_size_kb": 64, 00:11:02.921 "state": "configuring", 00:11:02.921 "raid_level": "raid0", 00:11:02.921 "superblock": false, 00:11:02.921 "num_base_bdevs": 3, 00:11:02.921 "num_base_bdevs_discovered": 1, 00:11:02.921 "num_base_bdevs_operational": 3, 00:11:02.921 "base_bdevs_list": [ 00:11:02.921 { 00:11:02.921 "name": null, 00:11:02.921 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:02.921 "is_configured": false, 00:11:02.921 "data_offset": 0, 00:11:02.921 "data_size": 65536 00:11:02.921 }, 00:11:02.921 { 00:11:02.921 "name": null, 00:11:02.921 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:02.921 "is_configured": false, 00:11:02.921 "data_offset": 0, 00:11:02.921 "data_size": 65536 00:11:02.921 }, 00:11:02.921 { 00:11:02.921 "name": "BaseBdev3", 00:11:02.921 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:02.921 "is_configured": true, 00:11:02.921 "data_offset": 0, 00:11:02.921 "data_size": 65536 00:11:02.921 } 00:11:02.921 ] 00:11:02.921 }' 00:11:02.921 23:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:02.921 23:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.485 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 
00:11:03.485 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.485 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:03.485 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:03.742 [2024-07-24 23:32:48.545526] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.742 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:03.743 "name": "Existed_Raid", 00:11:03.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:03.743 "strip_size_kb": 64, 00:11:03.743 "state": "configuring", 00:11:03.743 "raid_level": "raid0", 00:11:03.743 "superblock": false, 00:11:03.743 "num_base_bdevs": 3, 00:11:03.743 "num_base_bdevs_discovered": 2, 00:11:03.743 "num_base_bdevs_operational": 3, 00:11:03.743 "base_bdevs_list": [ 00:11:03.743 { 00:11:03.743 "name": null, 00:11:03.743 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:03.743 "is_configured": false, 00:11:03.743 "data_offset": 0, 00:11:03.743 "data_size": 65536 00:11:03.743 }, 00:11:03.743 { 00:11:03.743 "name": "BaseBdev2", 00:11:03.743 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:03.743 "is_configured": true, 00:11:03.743 "data_offset": 0, 00:11:03.743 "data_size": 65536 00:11:03.743 }, 00:11:03.743 { 00:11:03.743 "name": "BaseBdev3", 00:11:03.743 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:03.743 "is_configured": true, 00:11:03.743 "data_offset": 0, 00:11:03.743 "data_size": 65536 00:11:03.743 } 00:11:03.743 ] 00:11:03.743 }' 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:03.743 23:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.306 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.306 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:04.563 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e 
]] 00:11:04.563 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.563 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:04.563 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 53d245ae-49e9-4301-9bee-2bb5c5fa575c 00:11:04.821 [2024-07-24 23:32:49.687166] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:04.821 [2024-07-24 23:32:49.687194] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x175ff60 00:11:04.821 [2024-07-24 23:32:49.687198] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:04.821 [2024-07-24 23:32:49.687327] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1765ef0 00:11:04.821 [2024-07-24 23:32:49.687403] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x175ff60 00:11:04.821 [2024-07-24 23:32:49.687408] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x175ff60 00:11:04.821 [2024-07-24 23:32:49.687534] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.821 NewBaseBdev 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:04.821 23:32:49 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:04.821 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:05.079 23:32:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:05.079 [ 00:11:05.079 { 00:11:05.079 "name": "NewBaseBdev", 00:11:05.079 "aliases": [ 00:11:05.079 "53d245ae-49e9-4301-9bee-2bb5c5fa575c" 00:11:05.079 ], 00:11:05.079 "product_name": "Malloc disk", 00:11:05.079 "block_size": 512, 00:11:05.079 "num_blocks": 65536, 00:11:05.079 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:05.079 "assigned_rate_limits": { 00:11:05.079 "rw_ios_per_sec": 0, 00:11:05.079 "rw_mbytes_per_sec": 0, 00:11:05.079 "r_mbytes_per_sec": 0, 00:11:05.079 "w_mbytes_per_sec": 0 00:11:05.079 }, 00:11:05.079 "claimed": true, 00:11:05.079 "claim_type": "exclusive_write", 00:11:05.079 "zoned": false, 00:11:05.079 "supported_io_types": { 00:11:05.079 "read": true, 00:11:05.079 "write": true, 00:11:05.079 "unmap": true, 00:11:05.079 "flush": true, 00:11:05.079 "reset": true, 00:11:05.079 "nvme_admin": false, 00:11:05.079 "nvme_io": false, 00:11:05.079 "nvme_io_md": false, 00:11:05.079 "write_zeroes": true, 00:11:05.079 "zcopy": true, 00:11:05.079 "get_zone_info": false, 00:11:05.079 "zone_management": false, 00:11:05.079 "zone_append": false, 00:11:05.079 "compare": false, 00:11:05.079 "compare_and_write": false, 00:11:05.079 "abort": true, 00:11:05.079 "seek_hole": false, 00:11:05.079 "seek_data": false, 00:11:05.079 "copy": true, 00:11:05.079 "nvme_iov_md": false 00:11:05.079 }, 00:11:05.079 "memory_domains": [ 00:11:05.079 
{ 00:11:05.079 "dma_device_id": "system", 00:11:05.079 "dma_device_type": 1 00:11:05.079 }, 00:11:05.079 { 00:11:05.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.079 "dma_device_type": 2 00:11:05.079 } 00:11:05.079 ], 00:11:05.079 "driver_specific": {} 00:11:05.079 } 00:11:05.079 ] 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.079 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.336 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.336 
"name": "Existed_Raid", 00:11:05.336 "uuid": "0bd41871-ce84-45f9-8659-d67afcbe1c95", 00:11:05.336 "strip_size_kb": 64, 00:11:05.336 "state": "online", 00:11:05.336 "raid_level": "raid0", 00:11:05.336 "superblock": false, 00:11:05.336 "num_base_bdevs": 3, 00:11:05.336 "num_base_bdevs_discovered": 3, 00:11:05.336 "num_base_bdevs_operational": 3, 00:11:05.336 "base_bdevs_list": [ 00:11:05.336 { 00:11:05.336 "name": "NewBaseBdev", 00:11:05.336 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:05.336 "is_configured": true, 00:11:05.336 "data_offset": 0, 00:11:05.336 "data_size": 65536 00:11:05.336 }, 00:11:05.336 { 00:11:05.336 "name": "BaseBdev2", 00:11:05.336 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:05.336 "is_configured": true, 00:11:05.336 "data_offset": 0, 00:11:05.336 "data_size": 65536 00:11:05.336 }, 00:11:05.336 { 00:11:05.336 "name": "BaseBdev3", 00:11:05.336 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:05.336 "is_configured": true, 00:11:05.336 "data_offset": 0, 00:11:05.336 "data_size": 65536 00:11:05.336 } 00:11:05.336 ] 00:11:05.336 }' 00:11:05.336 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.336 23:32:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:05.899 23:32:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:05.899 [2024-07-24 23:32:50.790218] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.899 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:05.899 "name": "Existed_Raid", 00:11:05.899 "aliases": [ 00:11:05.899 "0bd41871-ce84-45f9-8659-d67afcbe1c95" 00:11:05.899 ], 00:11:05.899 "product_name": "Raid Volume", 00:11:05.899 "block_size": 512, 00:11:05.899 "num_blocks": 196608, 00:11:05.899 "uuid": "0bd41871-ce84-45f9-8659-d67afcbe1c95", 00:11:05.899 "assigned_rate_limits": { 00:11:05.899 "rw_ios_per_sec": 0, 00:11:05.899 "rw_mbytes_per_sec": 0, 00:11:05.899 "r_mbytes_per_sec": 0, 00:11:05.899 "w_mbytes_per_sec": 0 00:11:05.899 }, 00:11:05.899 "claimed": false, 00:11:05.899 "zoned": false, 00:11:05.899 "supported_io_types": { 00:11:05.899 "read": true, 00:11:05.899 "write": true, 00:11:05.899 "unmap": true, 00:11:05.899 "flush": true, 00:11:05.899 "reset": true, 00:11:05.899 "nvme_admin": false, 00:11:05.899 "nvme_io": false, 00:11:05.899 "nvme_io_md": false, 00:11:05.899 "write_zeroes": true, 00:11:05.899 "zcopy": false, 00:11:05.899 "get_zone_info": false, 00:11:05.899 "zone_management": false, 00:11:05.899 "zone_append": false, 00:11:05.899 "compare": false, 00:11:05.899 "compare_and_write": false, 00:11:05.899 "abort": false, 00:11:05.899 "seek_hole": false, 00:11:05.899 "seek_data": false, 00:11:05.899 "copy": false, 00:11:05.899 "nvme_iov_md": false 00:11:05.899 }, 00:11:05.899 "memory_domains": [ 00:11:05.899 { 00:11:05.899 "dma_device_id": "system", 00:11:05.899 "dma_device_type": 1 00:11:05.899 }, 00:11:05.899 { 00:11:05.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.899 
"dma_device_type": 2 00:11:05.899 }, 00:11:05.899 { 00:11:05.899 "dma_device_id": "system", 00:11:05.899 "dma_device_type": 1 00:11:05.899 }, 00:11:05.899 { 00:11:05.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.899 "dma_device_type": 2 00:11:05.899 }, 00:11:05.899 { 00:11:05.899 "dma_device_id": "system", 00:11:05.899 "dma_device_type": 1 00:11:05.899 }, 00:11:05.899 { 00:11:05.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.900 "dma_device_type": 2 00:11:05.900 } 00:11:05.900 ], 00:11:05.900 "driver_specific": { 00:11:05.900 "raid": { 00:11:05.900 "uuid": "0bd41871-ce84-45f9-8659-d67afcbe1c95", 00:11:05.900 "strip_size_kb": 64, 00:11:05.900 "state": "online", 00:11:05.900 "raid_level": "raid0", 00:11:05.900 "superblock": false, 00:11:05.900 "num_base_bdevs": 3, 00:11:05.900 "num_base_bdevs_discovered": 3, 00:11:05.900 "num_base_bdevs_operational": 3, 00:11:05.900 "base_bdevs_list": [ 00:11:05.900 { 00:11:05.900 "name": "NewBaseBdev", 00:11:05.900 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:05.900 "is_configured": true, 00:11:05.900 "data_offset": 0, 00:11:05.900 "data_size": 65536 00:11:05.900 }, 00:11:05.900 { 00:11:05.900 "name": "BaseBdev2", 00:11:05.900 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:05.900 "is_configured": true, 00:11:05.900 "data_offset": 0, 00:11:05.900 "data_size": 65536 00:11:05.900 }, 00:11:05.900 { 00:11:05.900 "name": "BaseBdev3", 00:11:05.900 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:05.900 "is_configured": true, 00:11:05.900 "data_offset": 0, 00:11:05.900 "data_size": 65536 00:11:05.900 } 00:11:05.900 ] 00:11:05.900 } 00:11:05.900 } 00:11:05.900 }' 00:11:05.900 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.900 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:05.900 BaseBdev2 00:11:05.900 BaseBdev3' 
00:11:05.900 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:05.900 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:05.900 23:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.157 "name": "NewBaseBdev", 00:11:06.157 "aliases": [ 00:11:06.157 "53d245ae-49e9-4301-9bee-2bb5c5fa575c" 00:11:06.157 ], 00:11:06.157 "product_name": "Malloc disk", 00:11:06.157 "block_size": 512, 00:11:06.157 "num_blocks": 65536, 00:11:06.157 "uuid": "53d245ae-49e9-4301-9bee-2bb5c5fa575c", 00:11:06.157 "assigned_rate_limits": { 00:11:06.157 "rw_ios_per_sec": 0, 00:11:06.157 "rw_mbytes_per_sec": 0, 00:11:06.157 "r_mbytes_per_sec": 0, 00:11:06.157 "w_mbytes_per_sec": 0 00:11:06.157 }, 00:11:06.157 "claimed": true, 00:11:06.157 "claim_type": "exclusive_write", 00:11:06.157 "zoned": false, 00:11:06.157 "supported_io_types": { 00:11:06.157 "read": true, 00:11:06.157 "write": true, 00:11:06.157 "unmap": true, 00:11:06.157 "flush": true, 00:11:06.157 "reset": true, 00:11:06.157 "nvme_admin": false, 00:11:06.157 "nvme_io": false, 00:11:06.157 "nvme_io_md": false, 00:11:06.157 "write_zeroes": true, 00:11:06.157 "zcopy": true, 00:11:06.157 "get_zone_info": false, 00:11:06.157 "zone_management": false, 00:11:06.157 "zone_append": false, 00:11:06.157 "compare": false, 00:11:06.157 "compare_and_write": false, 00:11:06.157 "abort": true, 00:11:06.157 "seek_hole": false, 00:11:06.157 "seek_data": false, 00:11:06.157 "copy": true, 00:11:06.157 "nvme_iov_md": false 00:11:06.157 }, 00:11:06.157 "memory_domains": [ 00:11:06.157 { 00:11:06.157 "dma_device_id": "system", 00:11:06.157 "dma_device_type": 1 00:11:06.157 }, 00:11:06.157 { 00:11:06.157 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.157 "dma_device_type": 2 00:11:06.157 } 00:11:06.157 ], 00:11:06.157 "driver_specific": {} 00:11:06.157 }' 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.157 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:06.415 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:06.672 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:06.672 "name": "BaseBdev2", 00:11:06.672 "aliases": [ 00:11:06.672 
"d7010fa9-2e60-4663-acb9-fcfac5c92043" 00:11:06.672 ], 00:11:06.672 "product_name": "Malloc disk", 00:11:06.672 "block_size": 512, 00:11:06.672 "num_blocks": 65536, 00:11:06.672 "uuid": "d7010fa9-2e60-4663-acb9-fcfac5c92043", 00:11:06.672 "assigned_rate_limits": { 00:11:06.672 "rw_ios_per_sec": 0, 00:11:06.672 "rw_mbytes_per_sec": 0, 00:11:06.672 "r_mbytes_per_sec": 0, 00:11:06.672 "w_mbytes_per_sec": 0 00:11:06.672 }, 00:11:06.672 "claimed": true, 00:11:06.672 "claim_type": "exclusive_write", 00:11:06.672 "zoned": false, 00:11:06.672 "supported_io_types": { 00:11:06.672 "read": true, 00:11:06.672 "write": true, 00:11:06.672 "unmap": true, 00:11:06.672 "flush": true, 00:11:06.672 "reset": true, 00:11:06.672 "nvme_admin": false, 00:11:06.672 "nvme_io": false, 00:11:06.672 "nvme_io_md": false, 00:11:06.672 "write_zeroes": true, 00:11:06.672 "zcopy": true, 00:11:06.672 "get_zone_info": false, 00:11:06.672 "zone_management": false, 00:11:06.672 "zone_append": false, 00:11:06.672 "compare": false, 00:11:06.672 "compare_and_write": false, 00:11:06.672 "abort": true, 00:11:06.672 "seek_hole": false, 00:11:06.672 "seek_data": false, 00:11:06.672 "copy": true, 00:11:06.672 "nvme_iov_md": false 00:11:06.672 }, 00:11:06.672 "memory_domains": [ 00:11:06.672 { 00:11:06.672 "dma_device_id": "system", 00:11:06.672 "dma_device_type": 1 00:11:06.672 }, 00:11:06.672 { 00:11:06.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.672 "dma_device_type": 2 00:11:06.672 } 00:11:06.672 ], 00:11:06.672 "driver_specific": {} 00:11:06.672 }' 00:11:06.672 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.672 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:06.672 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:06.673 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.673 23:32:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:06.673 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:06.673 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:06.930 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:07.188 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:07.188 "name": "BaseBdev3", 00:11:07.188 "aliases": [ 00:11:07.188 "40351b8f-5862-423f-885d-71880253c516" 00:11:07.188 ], 00:11:07.188 "product_name": "Malloc disk", 00:11:07.188 "block_size": 512, 00:11:07.188 "num_blocks": 65536, 00:11:07.188 "uuid": "40351b8f-5862-423f-885d-71880253c516", 00:11:07.188 "assigned_rate_limits": { 00:11:07.188 "rw_ios_per_sec": 0, 00:11:07.188 "rw_mbytes_per_sec": 0, 00:11:07.188 "r_mbytes_per_sec": 0, 00:11:07.188 "w_mbytes_per_sec": 0 00:11:07.188 }, 00:11:07.188 "claimed": true, 00:11:07.188 "claim_type": "exclusive_write", 00:11:07.188 "zoned": false, 00:11:07.188 "supported_io_types": { 00:11:07.188 "read": true, 
00:11:07.188 "write": true, 00:11:07.188 "unmap": true, 00:11:07.188 "flush": true, 00:11:07.188 "reset": true, 00:11:07.188 "nvme_admin": false, 00:11:07.188 "nvme_io": false, 00:11:07.188 "nvme_io_md": false, 00:11:07.188 "write_zeroes": true, 00:11:07.188 "zcopy": true, 00:11:07.188 "get_zone_info": false, 00:11:07.188 "zone_management": false, 00:11:07.188 "zone_append": false, 00:11:07.188 "compare": false, 00:11:07.188 "compare_and_write": false, 00:11:07.188 "abort": true, 00:11:07.188 "seek_hole": false, 00:11:07.188 "seek_data": false, 00:11:07.188 "copy": true, 00:11:07.188 "nvme_iov_md": false 00:11:07.188 }, 00:11:07.188 "memory_domains": [ 00:11:07.188 { 00:11:07.188 "dma_device_id": "system", 00:11:07.188 "dma_device_type": 1 00:11:07.188 }, 00:11:07.188 { 00:11:07.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.188 "dma_device_type": 2 00:11:07.188 } 00:11:07.188 ], 00:11:07.188 "driver_specific": {} 00:11:07.188 }' 00:11:07.188 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:07.188 23:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:07.188 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:07.188 
23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:07.447 [2024-07-24 23:32:52.362099] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:07.447 [2024-07-24 23:32:52.362119] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.447 [2024-07-24 23:32:52.362158] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.447 [2024-07-24 23:32:52.362195] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:07.447 [2024-07-24 23:32:52.362201] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x175ff60 name Existed_Raid, state offline 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 263277 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 263277 ']' 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 263277 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 263277 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:07.447 
23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 263277' 00:11:07.447 killing process with pid 263277 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 263277 00:11:07.447 [2024-07-24 23:32:52.415348] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:07.447 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 263277 00:11:07.447 [2024-07-24 23:32:52.438432] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:07.705 00:11:07.705 real 0m21.270s 00:11:07.705 user 0m39.664s 00:11:07.705 sys 0m3.250s 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.705 ************************************ 00:11:07.705 END TEST raid_state_function_test 00:11:07.705 ************************************ 00:11:07.705 23:32:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:11:07.705 23:32:52 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:07.705 23:32:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:07.705 23:32:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:07.705 ************************************ 00:11:07.705 START TEST raid_state_function_test_sb 00:11:07.705 ************************************ 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 true 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:07.705 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=267471 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 267471' 00:11:07.706 Process raid pid: 267471 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 267471 /var/tmp/spdk-raid.sock 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 267471 ']' 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:07.706 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:07.706 23:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:07.963 [2024-07-24 23:32:52.737156] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:11:07.963 [2024-07-24 23:32:52.737195] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:07.963 [2024-07-24 23:32:52.800693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:07.963 [2024-07-24 23:32:52.878840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:07.963 [2024-07-24 23:32:52.929296] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:07.963 [2024-07-24 23:32:52.929319] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.527 23:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:08.527 23:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:11:08.527 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:08.784 [2024-07-24 23:32:53.680253] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:08.784 [2024-07-24 23:32:53.680281] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:08.784 [2024-07-24 23:32:53.680287] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:11:08.784 [2024-07-24 23:32:53.680293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:08.784 [2024-07-24 23:32:53.680297] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:08.784 [2024-07-24 23:32:53.680301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.784 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.041 23:32:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.041 "name": "Existed_Raid", 00:11:09.041 "uuid": "b09c481d-2ce9-48d4-92c1-6d261c3dfadb", 00:11:09.041 "strip_size_kb": 64, 00:11:09.041 "state": "configuring", 00:11:09.041 "raid_level": "raid0", 00:11:09.041 "superblock": true, 00:11:09.041 "num_base_bdevs": 3, 00:11:09.041 "num_base_bdevs_discovered": 0, 00:11:09.041 "num_base_bdevs_operational": 3, 00:11:09.041 "base_bdevs_list": [ 00:11:09.041 { 00:11:09.041 "name": "BaseBdev1", 00:11:09.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.041 "is_configured": false, 00:11:09.041 "data_offset": 0, 00:11:09.041 "data_size": 0 00:11:09.041 }, 00:11:09.041 { 00:11:09.041 "name": "BaseBdev2", 00:11:09.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.041 "is_configured": false, 00:11:09.041 "data_offset": 0, 00:11:09.041 "data_size": 0 00:11:09.041 }, 00:11:09.041 { 00:11:09.041 "name": "BaseBdev3", 00:11:09.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.042 "is_configured": false, 00:11:09.042 "data_offset": 0, 00:11:09.042 "data_size": 0 00:11:09.042 } 00:11:09.042 ] 00:11:09.042 }' 00:11:09.042 23:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.042 23:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:09.606 23:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:09.606 [2024-07-24 23:32:54.474213] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:09.606 [2024-07-24 23:32:54.474233] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea9b30 name Existed_Raid, state configuring 00:11:09.606 23:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:09.864 [2024-07-24 23:32:54.642667] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:09.864 [2024-07-24 23:32:54.642683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:09.864 [2024-07-24 23:32:54.642688] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:09.864 [2024-07-24 23:32:54.642693] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:09.864 [2024-07-24 23:32:54.642696] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:09.864 [2024-07-24 23:32:54.642701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:09.864 [2024-07-24 23:32:54.819126] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:09.864 BaseBdev1 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:11:09.864 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:10.121 23:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:10.379 [ 00:11:10.379 { 00:11:10.379 "name": "BaseBdev1", 00:11:10.379 "aliases": [ 00:11:10.379 "89423e85-f0ee-4cf6-8674-a7f4a0827d34" 00:11:10.379 ], 00:11:10.379 "product_name": "Malloc disk", 00:11:10.379 "block_size": 512, 00:11:10.379 "num_blocks": 65536, 00:11:10.379 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:10.379 "assigned_rate_limits": { 00:11:10.379 "rw_ios_per_sec": 0, 00:11:10.379 "rw_mbytes_per_sec": 0, 00:11:10.379 "r_mbytes_per_sec": 0, 00:11:10.379 "w_mbytes_per_sec": 0 00:11:10.379 }, 00:11:10.379 "claimed": true, 00:11:10.379 "claim_type": "exclusive_write", 00:11:10.379 "zoned": false, 00:11:10.379 "supported_io_types": { 00:11:10.379 "read": true, 00:11:10.379 "write": true, 00:11:10.379 "unmap": true, 00:11:10.379 "flush": true, 00:11:10.379 "reset": true, 00:11:10.379 "nvme_admin": false, 00:11:10.379 "nvme_io": false, 00:11:10.379 "nvme_io_md": false, 00:11:10.379 "write_zeroes": true, 00:11:10.379 "zcopy": true, 00:11:10.379 "get_zone_info": false, 00:11:10.379 "zone_management": false, 00:11:10.379 "zone_append": false, 00:11:10.379 "compare": false, 00:11:10.379 "compare_and_write": false, 00:11:10.379 "abort": true, 00:11:10.379 "seek_hole": false, 00:11:10.379 "seek_data": false, 00:11:10.379 "copy": true, 00:11:10.379 "nvme_iov_md": false 00:11:10.379 }, 00:11:10.379 "memory_domains": [ 00:11:10.379 { 00:11:10.379 "dma_device_id": "system", 00:11:10.379 "dma_device_type": 1 00:11:10.379 }, 00:11:10.379 { 00:11:10.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:10.379 
"dma_device_type": 2 00:11:10.379 } 00:11:10.379 ], 00:11:10.379 "driver_specific": {} 00:11:10.379 } 00:11:10.379 ] 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.379 "name": "Existed_Raid", 00:11:10.379 "uuid": "1ac741f0-9a59-47b1-b129-3bfda7c11979", 00:11:10.379 "strip_size_kb": 64, 
00:11:10.379 "state": "configuring", 00:11:10.379 "raid_level": "raid0", 00:11:10.379 "superblock": true, 00:11:10.379 "num_base_bdevs": 3, 00:11:10.379 "num_base_bdevs_discovered": 1, 00:11:10.379 "num_base_bdevs_operational": 3, 00:11:10.379 "base_bdevs_list": [ 00:11:10.379 { 00:11:10.379 "name": "BaseBdev1", 00:11:10.379 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:10.379 "is_configured": true, 00:11:10.379 "data_offset": 2048, 00:11:10.379 "data_size": 63488 00:11:10.379 }, 00:11:10.379 { 00:11:10.379 "name": "BaseBdev2", 00:11:10.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.379 "is_configured": false, 00:11:10.379 "data_offset": 0, 00:11:10.379 "data_size": 0 00:11:10.379 }, 00:11:10.379 { 00:11:10.379 "name": "BaseBdev3", 00:11:10.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.379 "is_configured": false, 00:11:10.379 "data_offset": 0, 00:11:10.379 "data_size": 0 00:11:10.379 } 00:11:10.379 ] 00:11:10.379 }' 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.379 23:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:10.944 23:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:11.202 [2024-07-24 23:32:55.990158] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:11.202 [2024-07-24 23:32:55.990185] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea93a0 name Existed_Raid, state configuring 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:11.202 [2024-07-24 23:32:56.158620] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:11.202 [2024-07-24 23:32:56.159680] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:11.202 [2024-07-24 23:32:56.159704] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:11.202 [2024-07-24 23:32:56.159709] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:11.202 [2024-07-24 23:32:56.159714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.202 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:11.460 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:11.460 "name": "Existed_Raid", 00:11:11.460 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:11.460 "strip_size_kb": 64, 00:11:11.460 "state": "configuring", 00:11:11.460 "raid_level": "raid0", 00:11:11.460 "superblock": true, 00:11:11.460 "num_base_bdevs": 3, 00:11:11.460 "num_base_bdevs_discovered": 1, 00:11:11.460 "num_base_bdevs_operational": 3, 00:11:11.460 "base_bdevs_list": [ 00:11:11.460 { 00:11:11.460 "name": "BaseBdev1", 00:11:11.460 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:11.460 "is_configured": true, 00:11:11.460 "data_offset": 2048, 00:11:11.460 "data_size": 63488 00:11:11.460 }, 00:11:11.460 { 00:11:11.460 "name": "BaseBdev2", 00:11:11.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.460 "is_configured": false, 00:11:11.460 "data_offset": 0, 00:11:11.460 "data_size": 0 00:11:11.460 }, 00:11:11.460 { 00:11:11.460 "name": "BaseBdev3", 00:11:11.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:11.460 "is_configured": false, 00:11:11.460 "data_offset": 0, 00:11:11.460 "data_size": 0 00:11:11.460 } 00:11:11.460 ] 00:11:11.460 }' 00:11:11.460 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:11.460 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:12.046 
[2024-07-24 23:32:56.975376] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:12.046 BaseBdev2 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:12.046 23:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:12.317 [ 00:11:12.317 { 00:11:12.317 "name": "BaseBdev2", 00:11:12.317 "aliases": [ 00:11:12.317 "895c4e20-2154-4966-a420-e9c8a49bdd34" 00:11:12.317 ], 00:11:12.317 "product_name": "Malloc disk", 00:11:12.317 "block_size": 512, 00:11:12.317 "num_blocks": 65536, 00:11:12.317 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:12.317 "assigned_rate_limits": { 00:11:12.317 "rw_ios_per_sec": 0, 00:11:12.317 "rw_mbytes_per_sec": 0, 00:11:12.317 "r_mbytes_per_sec": 0, 00:11:12.317 "w_mbytes_per_sec": 0 00:11:12.317 }, 00:11:12.317 "claimed": true, 00:11:12.317 "claim_type": "exclusive_write", 00:11:12.317 "zoned": false, 00:11:12.317 "supported_io_types": { 00:11:12.317 "read": true, 00:11:12.317 "write": true, 00:11:12.317 "unmap": 
true, 00:11:12.317 "flush": true, 00:11:12.317 "reset": true, 00:11:12.317 "nvme_admin": false, 00:11:12.317 "nvme_io": false, 00:11:12.317 "nvme_io_md": false, 00:11:12.317 "write_zeroes": true, 00:11:12.317 "zcopy": true, 00:11:12.317 "get_zone_info": false, 00:11:12.317 "zone_management": false, 00:11:12.317 "zone_append": false, 00:11:12.317 "compare": false, 00:11:12.317 "compare_and_write": false, 00:11:12.317 "abort": true, 00:11:12.317 "seek_hole": false, 00:11:12.317 "seek_data": false, 00:11:12.317 "copy": true, 00:11:12.317 "nvme_iov_md": false 00:11:12.317 }, 00:11:12.317 "memory_domains": [ 00:11:12.317 { 00:11:12.317 "dma_device_id": "system", 00:11:12.317 "dma_device_type": 1 00:11:12.317 }, 00:11:12.317 { 00:11:12.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.317 "dma_device_type": 2 00:11:12.317 } 00:11:12.317 ], 00:11:12.317 "driver_specific": {} 00:11:12.317 } 00:11:12.317 ] 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:12.317 
23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.317 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.575 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.575 "name": "Existed_Raid", 00:11:12.575 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:12.575 "strip_size_kb": 64, 00:11:12.575 "state": "configuring", 00:11:12.575 "raid_level": "raid0", 00:11:12.575 "superblock": true, 00:11:12.575 "num_base_bdevs": 3, 00:11:12.575 "num_base_bdevs_discovered": 2, 00:11:12.575 "num_base_bdevs_operational": 3, 00:11:12.575 "base_bdevs_list": [ 00:11:12.575 { 00:11:12.575 "name": "BaseBdev1", 00:11:12.575 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:12.575 "is_configured": true, 00:11:12.575 "data_offset": 2048, 00:11:12.575 "data_size": 63488 00:11:12.575 }, 00:11:12.575 { 00:11:12.575 "name": "BaseBdev2", 00:11:12.575 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:12.575 "is_configured": true, 00:11:12.575 "data_offset": 2048, 00:11:12.575 "data_size": 63488 00:11:12.575 }, 00:11:12.575 { 00:11:12.575 "name": "BaseBdev3", 00:11:12.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.575 "is_configured": false, 00:11:12.575 "data_offset": 0, 00:11:12.575 "data_size": 0 00:11:12.575 } 00:11:12.575 ] 00:11:12.575 }' 00:11:12.575 
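The log shows `num_base_bdevs_discovered` climbing 0 → 1 → 2 as each `bdev_malloc_create` lands, with `state` stuck at `configuring`; only after BaseBdev3 is claimed (23:32:58) does the raid flip to `online`. A simplified model of that transition rule (not SPDK's actual state machine, just the behavior observed in this log):

```python
def raid_state(num_base_bdevs, discovered):
    """configuring until every base bdev is present, then online."""
    return "online" if discovered == num_base_bdevs else "configuring"

# Replays the progression seen in the log for a 3-member raid0
states = [raid_state(3, d) for d in range(4)]
print(states)  # ['configuring', 'configuring', 'configuring', 'online']
```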
23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.575 23:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:13.141 23:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:13.141 [2024-07-24 23:32:58.112924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:13.141 [2024-07-24 23:32:58.113060] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eaa2a0 00:11:13.141 [2024-07-24 23:32:58.113069] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:13.141 [2024-07-24 23:32:58.113191] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eab310 00:11:13.141 [2024-07-24 23:32:58.113274] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eaa2a0 00:11:13.141 [2024-07-24 23:32:58.113280] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1eaa2a0 00:11:13.141 [2024-07-24 23:32:58.113345] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.141 BaseBdev3 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:11:13.141 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:13.398 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:13.656 [ 00:11:13.656 { 00:11:13.656 "name": "BaseBdev3", 00:11:13.656 "aliases": [ 00:11:13.656 "0f9d8d74-1b77-4654-9dad-c1cc8c832ede" 00:11:13.656 ], 00:11:13.656 "product_name": "Malloc disk", 00:11:13.656 "block_size": 512, 00:11:13.656 "num_blocks": 65536, 00:11:13.656 "uuid": "0f9d8d74-1b77-4654-9dad-c1cc8c832ede", 00:11:13.656 "assigned_rate_limits": { 00:11:13.656 "rw_ios_per_sec": 0, 00:11:13.656 "rw_mbytes_per_sec": 0, 00:11:13.656 "r_mbytes_per_sec": 0, 00:11:13.656 "w_mbytes_per_sec": 0 00:11:13.656 }, 00:11:13.656 "claimed": true, 00:11:13.656 "claim_type": "exclusive_write", 00:11:13.656 "zoned": false, 00:11:13.656 "supported_io_types": { 00:11:13.656 "read": true, 00:11:13.656 "write": true, 00:11:13.656 "unmap": true, 00:11:13.656 "flush": true, 00:11:13.656 "reset": true, 00:11:13.656 "nvme_admin": false, 00:11:13.656 "nvme_io": false, 00:11:13.656 "nvme_io_md": false, 00:11:13.656 "write_zeroes": true, 00:11:13.656 "zcopy": true, 00:11:13.656 "get_zone_info": false, 00:11:13.656 "zone_management": false, 00:11:13.656 "zone_append": false, 00:11:13.656 "compare": false, 00:11:13.656 "compare_and_write": false, 00:11:13.656 "abort": true, 00:11:13.656 "seek_hole": false, 00:11:13.656 "seek_data": false, 00:11:13.656 "copy": true, 00:11:13.656 "nvme_iov_md": false 00:11:13.656 }, 00:11:13.656 "memory_domains": [ 00:11:13.656 { 00:11:13.656 "dma_device_id": "system", 00:11:13.656 "dma_device_type": 1 00:11:13.656 }, 00:11:13.656 { 00:11:13.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.656 
"dma_device_type": 2 00:11:13.656 } 00:11:13.656 ], 00:11:13.656 "driver_specific": {} 00:11:13.656 } 00:11:13.656 ] 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.656 23:32:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.656 "name": "Existed_Raid", 00:11:13.656 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:13.656 "strip_size_kb": 64, 00:11:13.656 "state": "online", 00:11:13.656 "raid_level": "raid0", 00:11:13.656 "superblock": true, 00:11:13.656 "num_base_bdevs": 3, 00:11:13.656 "num_base_bdevs_discovered": 3, 00:11:13.656 "num_base_bdevs_operational": 3, 00:11:13.656 "base_bdevs_list": [ 00:11:13.656 { 00:11:13.656 "name": "BaseBdev1", 00:11:13.656 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:13.656 "is_configured": true, 00:11:13.656 "data_offset": 2048, 00:11:13.656 "data_size": 63488 00:11:13.656 }, 00:11:13.656 { 00:11:13.656 "name": "BaseBdev2", 00:11:13.656 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:13.656 "is_configured": true, 00:11:13.656 "data_offset": 2048, 00:11:13.656 "data_size": 63488 00:11:13.656 }, 00:11:13.656 { 00:11:13.656 "name": "BaseBdev3", 00:11:13.656 "uuid": "0f9d8d74-1b77-4654-9dad-c1cc8c832ede", 00:11:13.656 "is_configured": true, 00:11:13.656 "data_offset": 2048, 00:11:13.656 "data_size": 63488 00:11:13.656 } 00:11:13.656 ] 00:11:13.656 }' 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.656 23:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:14.222 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:14.479 [2024-07-24 23:32:59.268243] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:14.479 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:14.479 "name": "Existed_Raid", 00:11:14.479 "aliases": [ 00:11:14.479 "4e234105-a48e-4e53-919a-869548b30d5a" 00:11:14.479 ], 00:11:14.479 "product_name": "Raid Volume", 00:11:14.479 "block_size": 512, 00:11:14.479 "num_blocks": 190464, 00:11:14.479 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:14.479 "assigned_rate_limits": { 00:11:14.479 "rw_ios_per_sec": 0, 00:11:14.479 "rw_mbytes_per_sec": 0, 00:11:14.479 "r_mbytes_per_sec": 0, 00:11:14.479 "w_mbytes_per_sec": 0 00:11:14.480 }, 00:11:14.480 "claimed": false, 00:11:14.480 "zoned": false, 00:11:14.480 "supported_io_types": { 00:11:14.480 "read": true, 00:11:14.480 "write": true, 00:11:14.480 "unmap": true, 00:11:14.480 "flush": true, 00:11:14.480 "reset": true, 00:11:14.480 "nvme_admin": false, 00:11:14.480 "nvme_io": false, 00:11:14.480 "nvme_io_md": false, 00:11:14.480 "write_zeroes": true, 00:11:14.480 "zcopy": false, 00:11:14.480 "get_zone_info": false, 00:11:14.480 "zone_management": false, 00:11:14.480 "zone_append": false, 00:11:14.480 "compare": false, 00:11:14.480 "compare_and_write": false, 00:11:14.480 "abort": false, 00:11:14.480 "seek_hole": false, 00:11:14.480 "seek_data": false, 00:11:14.480 "copy": false, 00:11:14.480 "nvme_iov_md": false 00:11:14.480 }, 00:11:14.480 "memory_domains": [ 00:11:14.480 { 00:11:14.480 "dma_device_id": "system", 00:11:14.480 
"dma_device_type": 1 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.480 "dma_device_type": 2 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "dma_device_id": "system", 00:11:14.480 "dma_device_type": 1 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.480 "dma_device_type": 2 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "dma_device_id": "system", 00:11:14.480 "dma_device_type": 1 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.480 "dma_device_type": 2 00:11:14.480 } 00:11:14.480 ], 00:11:14.480 "driver_specific": { 00:11:14.480 "raid": { 00:11:14.480 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:14.480 "strip_size_kb": 64, 00:11:14.480 "state": "online", 00:11:14.480 "raid_level": "raid0", 00:11:14.480 "superblock": true, 00:11:14.480 "num_base_bdevs": 3, 00:11:14.480 "num_base_bdevs_discovered": 3, 00:11:14.480 "num_base_bdevs_operational": 3, 00:11:14.480 "base_bdevs_list": [ 00:11:14.480 { 00:11:14.480 "name": "BaseBdev1", 00:11:14.480 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:14.480 "is_configured": true, 00:11:14.480 "data_offset": 2048, 00:11:14.480 "data_size": 63488 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "name": "BaseBdev2", 00:11:14.480 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:14.480 "is_configured": true, 00:11:14.480 "data_offset": 2048, 00:11:14.480 "data_size": 63488 00:11:14.480 }, 00:11:14.480 { 00:11:14.480 "name": "BaseBdev3", 00:11:14.480 "uuid": "0f9d8d74-1b77-4654-9dad-c1cc8c832ede", 00:11:14.480 "is_configured": true, 00:11:14.480 "data_offset": 2048, 00:11:14.480 "data_size": 63488 00:11:14.480 } 00:11:14.480 ] 00:11:14.480 } 00:11:14.480 } 00:11:14.480 }' 00:11:14.480 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:14.480 23:32:59 
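At `bdev_raid.sh@201` the test extracts the configured member names from the raid volume's `driver_specific.raid.base_bdevs_list` with `jq -r '... | select(.is_configured == true).name'`, yielding `BaseBdev1 BaseBdev2 BaseBdev3`. A Python rendering of that jq filter over a trimmed copy of the logged structure:

```python
# Trimmed from the Raid Volume JSON in the log; all three members are configured here
raid_info = {
    "driver_specific": {
        "raid": {
            "base_bdevs_list": [
                {"name": "BaseBdev1", "is_configured": True},
                {"name": "BaseBdev2", "is_configured": True},
                {"name": "BaseBdev3", "is_configured": True},
            ]
        }
    }
}

# jq: .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name
# (an unconfigured entry, is_configured == false, would simply be filtered out)
names = [b["name"]
         for b in raid_info["driver_specific"]["raid"]["base_bdevs_list"]
         if b["is_configured"]]
print(names)  # ['BaseBdev1', 'BaseBdev2', 'BaseBdev3']
```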
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:14.480 BaseBdev2 00:11:14.480 BaseBdev3' 00:11:14.480 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.480 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:14.480 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:14.737 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.738 "name": "BaseBdev1", 00:11:14.738 "aliases": [ 00:11:14.738 "89423e85-f0ee-4cf6-8674-a7f4a0827d34" 00:11:14.738 ], 00:11:14.738 "product_name": "Malloc disk", 00:11:14.738 "block_size": 512, 00:11:14.738 "num_blocks": 65536, 00:11:14.738 "uuid": "89423e85-f0ee-4cf6-8674-a7f4a0827d34", 00:11:14.738 "assigned_rate_limits": { 00:11:14.738 "rw_ios_per_sec": 0, 00:11:14.738 "rw_mbytes_per_sec": 0, 00:11:14.738 "r_mbytes_per_sec": 0, 00:11:14.738 "w_mbytes_per_sec": 0 00:11:14.738 }, 00:11:14.738 "claimed": true, 00:11:14.738 "claim_type": "exclusive_write", 00:11:14.738 "zoned": false, 00:11:14.738 "supported_io_types": { 00:11:14.738 "read": true, 00:11:14.738 "write": true, 00:11:14.738 "unmap": true, 00:11:14.738 "flush": true, 00:11:14.738 "reset": true, 00:11:14.738 "nvme_admin": false, 00:11:14.738 "nvme_io": false, 00:11:14.738 "nvme_io_md": false, 00:11:14.738 "write_zeroes": true, 00:11:14.738 "zcopy": true, 00:11:14.738 "get_zone_info": false, 00:11:14.738 "zone_management": false, 00:11:14.738 "zone_append": false, 00:11:14.738 "compare": false, 00:11:14.738 "compare_and_write": false, 00:11:14.738 "abort": true, 00:11:14.738 "seek_hole": false, 00:11:14.738 "seek_data": false, 00:11:14.738 "copy": true, 00:11:14.738 "nvme_iov_md": false 00:11:14.738 }, 00:11:14.738 "memory_domains": 
[ 00:11:14.738 { 00:11:14.738 "dma_device_id": "system", 00:11:14.738 "dma_device_type": 1 00:11:14.738 }, 00:11:14.738 { 00:11:14.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.738 "dma_device_type": 2 00:11:14.738 } 00:11:14.738 ], 00:11:14.738 "driver_specific": {} 00:11:14.738 }' 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.738 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
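The per-member checks at `bdev_raid.sh@205`–`@208` run `jq .block_size`, `.md_size`, `.md_interleave`, and `.dif_type` against each base bdev's descriptor and compare the output (`512 == 512`, `null == null` in the log). A minimal offline model of those comparisons, using `dict.get` to reproduce jq's `null` for keys absent from the Malloc disk descriptor:

```python
import json

# Fields trimmed from the BaseBdev1 descriptor logged above; the metadata
# fields are absent, which jq reports as null
base_bdev_info = json.loads(
    '{"name": "BaseBdev1", "product_name": "Malloc disk", "block_size": 512}'
)

expected = {
    "block_size": 512,     # @205: [[ 512 == 512 ]]
    "md_size": None,       # @206: [[ null == null ]]
    "md_interleave": None, # @207: [[ null == null ]]
    "dif_type": None,      # @208: [[ null == null ]]
}
for field, value in expected.items():
    assert base_bdev_info.get(field) == value
print(base_bdev_info["name"], "properties ok")
```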
00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.996 "name": "BaseBdev2", 00:11:14.996 "aliases": [ 00:11:14.996 "895c4e20-2154-4966-a420-e9c8a49bdd34" 00:11:14.996 ], 00:11:14.996 "product_name": "Malloc disk", 00:11:14.996 "block_size": 512, 00:11:14.996 "num_blocks": 65536, 00:11:14.996 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:14.996 "assigned_rate_limits": { 00:11:14.996 "rw_ios_per_sec": 0, 00:11:14.996 "rw_mbytes_per_sec": 0, 00:11:14.996 "r_mbytes_per_sec": 0, 00:11:14.996 "w_mbytes_per_sec": 0 00:11:14.996 }, 00:11:14.996 "claimed": true, 00:11:14.996 "claim_type": "exclusive_write", 00:11:14.996 "zoned": false, 00:11:14.996 "supported_io_types": { 00:11:14.996 "read": true, 00:11:14.996 "write": true, 00:11:14.996 "unmap": true, 00:11:14.996 "flush": true, 00:11:14.996 "reset": true, 00:11:14.996 "nvme_admin": false, 00:11:14.996 "nvme_io": false, 00:11:14.996 "nvme_io_md": false, 00:11:14.996 "write_zeroes": true, 00:11:14.996 "zcopy": true, 00:11:14.996 "get_zone_info": false, 00:11:14.996 "zone_management": false, 00:11:14.996 "zone_append": false, 00:11:14.996 "compare": false, 00:11:14.996 "compare_and_write": false, 00:11:14.996 "abort": true, 00:11:14.996 "seek_hole": false, 00:11:14.996 "seek_data": false, 00:11:14.996 "copy": true, 00:11:14.996 "nvme_iov_md": false 00:11:14.996 }, 00:11:14.996 "memory_domains": [ 00:11:14.996 { 00:11:14.996 "dma_device_id": "system", 00:11:14.996 "dma_device_type": 1 00:11:14.996 }, 00:11:14.996 { 00:11:14.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.996 "dma_device_type": 2 00:11:14.996 } 00:11:14.996 ], 00:11:14.996 "driver_specific": {} 00:11:14.996 }' 00:11:14.996 23:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:15.261 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:15.520 "name": "BaseBdev3", 00:11:15.520 "aliases": [ 00:11:15.520 "0f9d8d74-1b77-4654-9dad-c1cc8c832ede" 00:11:15.520 ], 00:11:15.520 "product_name": "Malloc disk", 00:11:15.520 "block_size": 512, 00:11:15.520 "num_blocks": 65536, 00:11:15.520 "uuid": "0f9d8d74-1b77-4654-9dad-c1cc8c832ede", 00:11:15.520 "assigned_rate_limits": { 00:11:15.520 "rw_ios_per_sec": 0, 00:11:15.520 "rw_mbytes_per_sec": 0, 00:11:15.520 "r_mbytes_per_sec": 0, 00:11:15.520 
"w_mbytes_per_sec": 0 00:11:15.520 }, 00:11:15.520 "claimed": true, 00:11:15.520 "claim_type": "exclusive_write", 00:11:15.520 "zoned": false, 00:11:15.520 "supported_io_types": { 00:11:15.520 "read": true, 00:11:15.520 "write": true, 00:11:15.520 "unmap": true, 00:11:15.520 "flush": true, 00:11:15.520 "reset": true, 00:11:15.520 "nvme_admin": false, 00:11:15.520 "nvme_io": false, 00:11:15.520 "nvme_io_md": false, 00:11:15.520 "write_zeroes": true, 00:11:15.520 "zcopy": true, 00:11:15.520 "get_zone_info": false, 00:11:15.520 "zone_management": false, 00:11:15.520 "zone_append": false, 00:11:15.520 "compare": false, 00:11:15.520 "compare_and_write": false, 00:11:15.520 "abort": true, 00:11:15.520 "seek_hole": false, 00:11:15.520 "seek_data": false, 00:11:15.520 "copy": true, 00:11:15.520 "nvme_iov_md": false 00:11:15.520 }, 00:11:15.520 "memory_domains": [ 00:11:15.520 { 00:11:15.520 "dma_device_id": "system", 00:11:15.520 "dma_device_type": 1 00:11:15.520 }, 00:11:15.520 { 00:11:15.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.520 "dma_device_type": 2 00:11:15.520 } 00:11:15.520 ], 00:11:15.520 "driver_specific": {} 00:11:15.520 }' 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.520 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:15.777 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:16.035 [2024-07-24 23:33:00.924401] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:16.035 [2024-07-24 23:33:00.924425] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:16.035 [2024-07-24 23:33:00.924457] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.035 23:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:16.294 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:16.294 "name": "Existed_Raid", 00:11:16.294 "uuid": "4e234105-a48e-4e53-919a-869548b30d5a", 00:11:16.294 "strip_size_kb": 64, 00:11:16.294 "state": "offline", 00:11:16.294 "raid_level": "raid0", 00:11:16.294 "superblock": true, 00:11:16.294 "num_base_bdevs": 3, 00:11:16.294 "num_base_bdevs_discovered": 2, 00:11:16.294 "num_base_bdevs_operational": 2, 00:11:16.294 "base_bdevs_list": [ 00:11:16.294 { 00:11:16.294 "name": null, 00:11:16.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:16.294 "is_configured": false, 00:11:16.294 "data_offset": 2048, 00:11:16.294 "data_size": 63488 00:11:16.294 }, 00:11:16.294 { 00:11:16.294 "name": "BaseBdev2", 00:11:16.294 "uuid": "895c4e20-2154-4966-a420-e9c8a49bdd34", 00:11:16.294 "is_configured": true, 00:11:16.294 "data_offset": 2048, 00:11:16.294 "data_size": 
63488 00:11:16.294 }, 00:11:16.294 { 00:11:16.294 "name": "BaseBdev3", 00:11:16.294 "uuid": "0f9d8d74-1b77-4654-9dad-c1cc8c832ede", 00:11:16.294 "is_configured": true, 00:11:16.294 "data_offset": 2048, 00:11:16.294 "data_size": 63488 00:11:16.294 } 00:11:16.294 ] 00:11:16.294 }' 00:11:16.294 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:16.294 23:33:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:16.858 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:17.115 [2024-07-24 23:33:01.939986] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:17.115 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:17.115 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:17.115 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:17.115 23:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:17.373 [2024-07-24 23:33:02.282676] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:17.373 [2024-07-24 23:33:02.282708] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eaa2a0 name Existed_Raid, state offline 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.373 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:17.630 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:17.887 BaseBdev2 00:11:17.887 23:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:11:17.887 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:17.887 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:17.887 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:17.887 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:17.888 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:17.888 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:17.888 23:33:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:18.145 [ 00:11:18.145 { 00:11:18.145 "name": "BaseBdev2", 00:11:18.145 "aliases": [ 00:11:18.145 "67851700-fd19-49ca-80b0-69bb8501a856" 00:11:18.145 ], 00:11:18.145 "product_name": "Malloc disk", 00:11:18.145 "block_size": 512, 00:11:18.145 "num_blocks": 65536, 00:11:18.145 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:18.145 "assigned_rate_limits": { 00:11:18.145 "rw_ios_per_sec": 0, 00:11:18.145 "rw_mbytes_per_sec": 0, 00:11:18.145 "r_mbytes_per_sec": 0, 00:11:18.145 "w_mbytes_per_sec": 0 00:11:18.145 }, 00:11:18.145 "claimed": false, 00:11:18.145 "zoned": false, 00:11:18.145 "supported_io_types": { 00:11:18.145 "read": true, 00:11:18.145 "write": true, 00:11:18.145 "unmap": true, 00:11:18.145 "flush": 
true, 00:11:18.145 "reset": true, 00:11:18.145 "nvme_admin": false, 00:11:18.145 "nvme_io": false, 00:11:18.145 "nvme_io_md": false, 00:11:18.145 "write_zeroes": true, 00:11:18.145 "zcopy": true, 00:11:18.145 "get_zone_info": false, 00:11:18.145 "zone_management": false, 00:11:18.145 "zone_append": false, 00:11:18.145 "compare": false, 00:11:18.145 "compare_and_write": false, 00:11:18.145 "abort": true, 00:11:18.145 "seek_hole": false, 00:11:18.145 "seek_data": false, 00:11:18.146 "copy": true, 00:11:18.146 "nvme_iov_md": false 00:11:18.146 }, 00:11:18.146 "memory_domains": [ 00:11:18.146 { 00:11:18.146 "dma_device_id": "system", 00:11:18.146 "dma_device_type": 1 00:11:18.146 }, 00:11:18.146 { 00:11:18.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.146 "dma_device_type": 2 00:11:18.146 } 00:11:18.146 ], 00:11:18.146 "driver_specific": {} 00:11:18.146 } 00:11:18.146 ] 00:11:18.146 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:18.146 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:18.146 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:18.146 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:18.403 BaseBdev3 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:18.403 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:18.662 [ 00:11:18.662 { 00:11:18.662 "name": "BaseBdev3", 00:11:18.662 "aliases": [ 00:11:18.662 "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33" 00:11:18.662 ], 00:11:18.662 "product_name": "Malloc disk", 00:11:18.662 "block_size": 512, 00:11:18.662 "num_blocks": 65536, 00:11:18.662 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:18.662 "assigned_rate_limits": { 00:11:18.662 "rw_ios_per_sec": 0, 00:11:18.662 "rw_mbytes_per_sec": 0, 00:11:18.662 "r_mbytes_per_sec": 0, 00:11:18.662 "w_mbytes_per_sec": 0 00:11:18.662 }, 00:11:18.662 "claimed": false, 00:11:18.662 "zoned": false, 00:11:18.662 "supported_io_types": { 00:11:18.662 "read": true, 00:11:18.662 "write": true, 00:11:18.662 "unmap": true, 00:11:18.662 "flush": true, 00:11:18.662 "reset": true, 00:11:18.662 "nvme_admin": false, 00:11:18.662 "nvme_io": false, 00:11:18.662 "nvme_io_md": false, 00:11:18.662 "write_zeroes": true, 00:11:18.662 "zcopy": true, 00:11:18.662 "get_zone_info": false, 00:11:18.662 "zone_management": false, 00:11:18.662 "zone_append": false, 00:11:18.662 "compare": false, 00:11:18.662 "compare_and_write": false, 00:11:18.662 "abort": true, 00:11:18.662 "seek_hole": false, 00:11:18.662 "seek_data": false, 00:11:18.662 "copy": true, 00:11:18.662 "nvme_iov_md": false 00:11:18.662 }, 00:11:18.662 "memory_domains": [ 00:11:18.662 { 00:11:18.662 "dma_device_id": "system", 00:11:18.662 "dma_device_type": 1 
00:11:18.662 }, 00:11:18.662 { 00:11:18.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.662 "dma_device_type": 2 00:11:18.662 } 00:11:18.662 ], 00:11:18.662 "driver_specific": {} 00:11:18.662 } 00:11:18.662 ] 00:11:18.662 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:18.662 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:11:18.662 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:11:18.662 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:18.919 [2024-07-24 23:33:03.665151] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:18.919 [2024-07-24 23:33:03.665181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:18.919 [2024-07-24 23:33:03.665193] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:18.919 [2024-07-24 23:33:03.666343] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.919 "name": "Existed_Raid", 00:11:18.919 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:18.919 "strip_size_kb": 64, 00:11:18.919 "state": "configuring", 00:11:18.919 "raid_level": "raid0", 00:11:18.919 "superblock": true, 00:11:18.919 "num_base_bdevs": 3, 00:11:18.919 "num_base_bdevs_discovered": 2, 00:11:18.919 "num_base_bdevs_operational": 3, 00:11:18.919 "base_bdevs_list": [ 00:11:18.919 { 00:11:18.919 "name": "BaseBdev1", 00:11:18.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.919 "is_configured": false, 00:11:18.919 "data_offset": 0, 00:11:18.919 "data_size": 0 00:11:18.919 }, 00:11:18.919 { 00:11:18.919 "name": "BaseBdev2", 00:11:18.919 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:18.919 "is_configured": true, 00:11:18.919 "data_offset": 2048, 00:11:18.919 "data_size": 63488 00:11:18.919 }, 00:11:18.919 { 00:11:18.919 "name": "BaseBdev3", 00:11:18.919 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:18.919 "is_configured": true, 00:11:18.919 "data_offset": 2048, 00:11:18.919 
"data_size": 63488 00:11:18.919 } 00:11:18.919 ] 00:11:18.919 }' 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.919 23:33:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.483 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:19.483 [2024-07-24 23:33:04.463196] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:19.741 "name": "Existed_Raid", 00:11:19.741 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:19.741 "strip_size_kb": 64, 00:11:19.741 "state": "configuring", 00:11:19.741 "raid_level": "raid0", 00:11:19.741 "superblock": true, 00:11:19.741 "num_base_bdevs": 3, 00:11:19.741 "num_base_bdevs_discovered": 1, 00:11:19.741 "num_base_bdevs_operational": 3, 00:11:19.741 "base_bdevs_list": [ 00:11:19.741 { 00:11:19.741 "name": "BaseBdev1", 00:11:19.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:19.741 "is_configured": false, 00:11:19.741 "data_offset": 0, 00:11:19.741 "data_size": 0 00:11:19.741 }, 00:11:19.741 { 00:11:19.741 "name": null, 00:11:19.741 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:19.741 "is_configured": false, 00:11:19.741 "data_offset": 2048, 00:11:19.741 "data_size": 63488 00:11:19.741 }, 00:11:19.741 { 00:11:19.741 "name": "BaseBdev3", 00:11:19.741 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:19.741 "is_configured": true, 00:11:19.741 "data_offset": 2048, 00:11:19.741 "data_size": 63488 00:11:19.741 } 00:11:19.741 ] 00:11:19.741 }' 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:19.741 23:33:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:20.307 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.307 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:20.564 [2024-07-24 23:33:05.481646] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:20.564 BaseBdev1 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:20.564 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:20.821 [ 00:11:20.821 { 00:11:20.821 "name": "BaseBdev1", 00:11:20.821 "aliases": [ 00:11:20.821 "a50bd7b5-90f4-4efd-b39d-ac932e206b6d" 00:11:20.821 ], 00:11:20.821 "product_name": "Malloc disk", 00:11:20.821 "block_size": 512, 00:11:20.821 "num_blocks": 65536, 00:11:20.821 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:20.821 "assigned_rate_limits": { 00:11:20.821 "rw_ios_per_sec": 0, 00:11:20.821 "rw_mbytes_per_sec": 0, 00:11:20.821 "r_mbytes_per_sec": 0, 00:11:20.821 
"w_mbytes_per_sec": 0 00:11:20.821 }, 00:11:20.821 "claimed": true, 00:11:20.821 "claim_type": "exclusive_write", 00:11:20.821 "zoned": false, 00:11:20.821 "supported_io_types": { 00:11:20.821 "read": true, 00:11:20.821 "write": true, 00:11:20.821 "unmap": true, 00:11:20.821 "flush": true, 00:11:20.821 "reset": true, 00:11:20.821 "nvme_admin": false, 00:11:20.821 "nvme_io": false, 00:11:20.821 "nvme_io_md": false, 00:11:20.821 "write_zeroes": true, 00:11:20.821 "zcopy": true, 00:11:20.821 "get_zone_info": false, 00:11:20.821 "zone_management": false, 00:11:20.821 "zone_append": false, 00:11:20.821 "compare": false, 00:11:20.821 "compare_and_write": false, 00:11:20.821 "abort": true, 00:11:20.821 "seek_hole": false, 00:11:20.821 "seek_data": false, 00:11:20.821 "copy": true, 00:11:20.821 "nvme_iov_md": false 00:11:20.821 }, 00:11:20.821 "memory_domains": [ 00:11:20.821 { 00:11:20.821 "dma_device_id": "system", 00:11:20.821 "dma_device_type": 1 00:11:20.821 }, 00:11:20.821 { 00:11:20.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.821 "dma_device_type": 2 00:11:20.821 } 00:11:20.821 ], 00:11:20.821 "driver_specific": {} 00:11:20.821 } 00:11:20.821 ] 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.821 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.822 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:21.079 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.079 "name": "Existed_Raid", 00:11:21.079 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:21.079 "strip_size_kb": 64, 00:11:21.079 "state": "configuring", 00:11:21.079 "raid_level": "raid0", 00:11:21.080 "superblock": true, 00:11:21.080 "num_base_bdevs": 3, 00:11:21.080 "num_base_bdevs_discovered": 2, 00:11:21.080 "num_base_bdevs_operational": 3, 00:11:21.080 "base_bdevs_list": [ 00:11:21.080 { 00:11:21.080 "name": "BaseBdev1", 00:11:21.080 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:21.080 "is_configured": true, 00:11:21.080 "data_offset": 2048, 00:11:21.080 "data_size": 63488 00:11:21.080 }, 00:11:21.080 { 00:11:21.080 "name": null, 00:11:21.080 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:21.080 "is_configured": false, 00:11:21.080 "data_offset": 2048, 00:11:21.080 "data_size": 63488 00:11:21.080 }, 00:11:21.080 { 00:11:21.080 "name": "BaseBdev3", 00:11:21.080 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:21.080 "is_configured": true, 00:11:21.080 "data_offset": 2048, 00:11:21.080 "data_size": 63488 00:11:21.080 } 
00:11:21.080 ] 00:11:21.080 }' 00:11:21.080 23:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.080 23:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:21.646 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.646 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:21.904 [2024-07-24 23:33:06.801058] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.904 
23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.904 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.161 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.161 "name": "Existed_Raid", 00:11:22.161 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:22.161 "strip_size_kb": 64, 00:11:22.161 "state": "configuring", 00:11:22.161 "raid_level": "raid0", 00:11:22.161 "superblock": true, 00:11:22.161 "num_base_bdevs": 3, 00:11:22.161 "num_base_bdevs_discovered": 1, 00:11:22.161 "num_base_bdevs_operational": 3, 00:11:22.161 "base_bdevs_list": [ 00:11:22.161 { 00:11:22.161 "name": "BaseBdev1", 00:11:22.161 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:22.161 "is_configured": true, 00:11:22.161 "data_offset": 2048, 00:11:22.161 "data_size": 63488 00:11:22.161 }, 00:11:22.161 { 00:11:22.161 "name": null, 00:11:22.161 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:22.161 "is_configured": false, 00:11:22.161 "data_offset": 2048, 00:11:22.161 "data_size": 63488 00:11:22.161 }, 00:11:22.161 { 00:11:22.161 "name": null, 00:11:22.161 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:22.161 "is_configured": false, 00:11:22.161 "data_offset": 2048, 00:11:22.161 "data_size": 63488 00:11:22.161 } 00:11:22.161 ] 00:11:22.161 }' 00:11:22.161 23:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.161 23:33:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:22.728 23:33:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.728 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:22.728 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:11:22.728 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:22.987 [2024-07-24 23:33:07.791625] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.987 23:33:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.987 "name": "Existed_Raid", 00:11:22.987 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:22.987 "strip_size_kb": 64, 00:11:22.987 "state": "configuring", 00:11:22.987 "raid_level": "raid0", 00:11:22.987 "superblock": true, 00:11:22.987 "num_base_bdevs": 3, 00:11:22.987 "num_base_bdevs_discovered": 2, 00:11:22.987 "num_base_bdevs_operational": 3, 00:11:22.987 "base_bdevs_list": [ 00:11:22.987 { 00:11:22.987 "name": "BaseBdev1", 00:11:22.987 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:22.987 "is_configured": true, 00:11:22.987 "data_offset": 2048, 00:11:22.987 "data_size": 63488 00:11:22.987 }, 00:11:22.987 { 00:11:22.987 "name": null, 00:11:22.987 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:22.987 "is_configured": false, 00:11:22.987 "data_offset": 2048, 00:11:22.987 "data_size": 63488 00:11:22.987 }, 00:11:22.987 { 00:11:22.987 "name": "BaseBdev3", 00:11:22.987 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:22.987 "is_configured": true, 00:11:22.987 "data_offset": 2048, 00:11:22.987 "data_size": 63488 00:11:22.987 } 00:11:22.987 ] 00:11:22.987 }' 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.987 23:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:23.553 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.553 23:33:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:23.811 [2024-07-24 23:33:08.754132] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.811 23:33:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:24.068 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.068 "name": "Existed_Raid", 00:11:24.068 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:24.068 "strip_size_kb": 64, 00:11:24.068 "state": "configuring", 00:11:24.068 "raid_level": "raid0", 00:11:24.068 "superblock": true, 00:11:24.068 "num_base_bdevs": 3, 00:11:24.068 "num_base_bdevs_discovered": 1, 00:11:24.068 "num_base_bdevs_operational": 3, 00:11:24.068 "base_bdevs_list": [ 00:11:24.069 { 00:11:24.069 "name": null, 00:11:24.069 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:24.069 "is_configured": false, 00:11:24.069 "data_offset": 2048, 00:11:24.069 "data_size": 63488 00:11:24.069 }, 00:11:24.069 { 00:11:24.069 "name": null, 00:11:24.069 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:24.069 "is_configured": false, 00:11:24.069 "data_offset": 2048, 00:11:24.069 "data_size": 63488 00:11:24.069 }, 00:11:24.069 { 00:11:24.069 "name": "BaseBdev3", 00:11:24.069 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:24.069 "is_configured": true, 00:11:24.069 "data_offset": 2048, 00:11:24.069 "data_size": 63488 00:11:24.069 } 00:11:24.069 ] 00:11:24.069 }' 00:11:24.069 23:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.069 23:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.634 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.634 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:24.635 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:11:24.635 23:33:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:24.893 [2024-07-24 23:33:09.778270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.893 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.151 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.151 "name": 
"Existed_Raid", 00:11:25.151 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:25.151 "strip_size_kb": 64, 00:11:25.151 "state": "configuring", 00:11:25.151 "raid_level": "raid0", 00:11:25.151 "superblock": true, 00:11:25.151 "num_base_bdevs": 3, 00:11:25.151 "num_base_bdevs_discovered": 2, 00:11:25.151 "num_base_bdevs_operational": 3, 00:11:25.151 "base_bdevs_list": [ 00:11:25.151 { 00:11:25.151 "name": null, 00:11:25.151 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:25.151 "is_configured": false, 00:11:25.151 "data_offset": 2048, 00:11:25.151 "data_size": 63488 00:11:25.151 }, 00:11:25.151 { 00:11:25.151 "name": "BaseBdev2", 00:11:25.151 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:25.151 "is_configured": true, 00:11:25.151 "data_offset": 2048, 00:11:25.151 "data_size": 63488 00:11:25.151 }, 00:11:25.151 { 00:11:25.151 "name": "BaseBdev3", 00:11:25.151 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:25.151 "is_configured": true, 00:11:25.151 "data_offset": 2048, 00:11:25.151 "data_size": 63488 00:11:25.151 } 00:11:25.151 ] 00:11:25.151 }' 00:11:25.151 23:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.151 23:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:25.717 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.717 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:25.717 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:11:25.717 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:25.717 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a50bd7b5-90f4-4efd-b39d-ac932e206b6d 00:11:25.975 [2024-07-24 23:33:10.945309] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:25.975 [2024-07-24 23:33:10.945442] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ea0600 00:11:25.975 [2024-07-24 23:33:10.945450] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:25.975 [2024-07-24 23:33:10.945593] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205be90 00:11:25.975 [2024-07-24 23:33:10.945681] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ea0600 00:11:25.975 [2024-07-24 23:33:10.945686] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ea0600 00:11:25.975 [2024-07-24 23:33:10.945760] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:25.975 NewBaseBdev 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:25.975 23:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:25.976 23:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:25.976 23:33:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:26.234 23:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:26.492 [ 00:11:26.492 { 00:11:26.492 "name": "NewBaseBdev", 00:11:26.492 "aliases": [ 00:11:26.492 "a50bd7b5-90f4-4efd-b39d-ac932e206b6d" 00:11:26.492 ], 00:11:26.492 "product_name": "Malloc disk", 00:11:26.492 "block_size": 512, 00:11:26.492 "num_blocks": 65536, 00:11:26.492 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:26.492 "assigned_rate_limits": { 00:11:26.492 "rw_ios_per_sec": 0, 00:11:26.492 "rw_mbytes_per_sec": 0, 00:11:26.492 "r_mbytes_per_sec": 0, 00:11:26.492 "w_mbytes_per_sec": 0 00:11:26.492 }, 00:11:26.492 "claimed": true, 00:11:26.492 "claim_type": "exclusive_write", 00:11:26.492 "zoned": false, 00:11:26.492 "supported_io_types": { 00:11:26.492 "read": true, 00:11:26.492 "write": true, 00:11:26.492 "unmap": true, 00:11:26.492 "flush": true, 00:11:26.492 "reset": true, 00:11:26.492 "nvme_admin": false, 00:11:26.492 "nvme_io": false, 00:11:26.492 "nvme_io_md": false, 00:11:26.492 "write_zeroes": true, 00:11:26.492 "zcopy": true, 00:11:26.492 "get_zone_info": false, 00:11:26.492 "zone_management": false, 00:11:26.492 "zone_append": false, 00:11:26.492 "compare": false, 00:11:26.492 "compare_and_write": false, 00:11:26.492 "abort": true, 00:11:26.492 "seek_hole": false, 00:11:26.492 "seek_data": false, 00:11:26.492 "copy": true, 00:11:26.492 "nvme_iov_md": false 00:11:26.492 }, 00:11:26.492 "memory_domains": [ 00:11:26.492 { 00:11:26.492 "dma_device_id": "system", 00:11:26.492 "dma_device_type": 1 00:11:26.492 }, 00:11:26.492 { 00:11:26.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.492 "dma_device_type": 2 00:11:26.492 } 
00:11:26.492 ], 00:11:26.492 "driver_specific": {} 00:11:26.492 } 00:11:26.492 ] 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.492 "name": "Existed_Raid", 00:11:26.492 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:26.492 "strip_size_kb": 64, 00:11:26.492 "state": "online", 00:11:26.492 
"raid_level": "raid0", 00:11:26.492 "superblock": true, 00:11:26.492 "num_base_bdevs": 3, 00:11:26.492 "num_base_bdevs_discovered": 3, 00:11:26.492 "num_base_bdevs_operational": 3, 00:11:26.492 "base_bdevs_list": [ 00:11:26.492 { 00:11:26.492 "name": "NewBaseBdev", 00:11:26.492 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:26.492 "is_configured": true, 00:11:26.492 "data_offset": 2048, 00:11:26.492 "data_size": 63488 00:11:26.492 }, 00:11:26.492 { 00:11:26.492 "name": "BaseBdev2", 00:11:26.492 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:26.492 "is_configured": true, 00:11:26.492 "data_offset": 2048, 00:11:26.492 "data_size": 63488 00:11:26.492 }, 00:11:26.492 { 00:11:26.492 "name": "BaseBdev3", 00:11:26.492 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:26.492 "is_configured": true, 00:11:26.492 "data_offset": 2048, 00:11:26.492 "data_size": 63488 00:11:26.492 } 00:11:26.492 ] 00:11:26.492 }' 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.492 23:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:27.058 23:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:27.058 [2024-07-24 23:33:12.016265] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:27.058 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:27.058 "name": "Existed_Raid", 00:11:27.058 "aliases": [ 00:11:27.058 "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc" 00:11:27.058 ], 00:11:27.059 "product_name": "Raid Volume", 00:11:27.059 "block_size": 512, 00:11:27.059 "num_blocks": 190464, 00:11:27.059 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:27.059 "assigned_rate_limits": { 00:11:27.059 "rw_ios_per_sec": 0, 00:11:27.059 "rw_mbytes_per_sec": 0, 00:11:27.059 "r_mbytes_per_sec": 0, 00:11:27.059 "w_mbytes_per_sec": 0 00:11:27.059 }, 00:11:27.059 "claimed": false, 00:11:27.059 "zoned": false, 00:11:27.059 "supported_io_types": { 00:11:27.059 "read": true, 00:11:27.059 "write": true, 00:11:27.059 "unmap": true, 00:11:27.059 "flush": true, 00:11:27.059 "reset": true, 00:11:27.059 "nvme_admin": false, 00:11:27.059 "nvme_io": false, 00:11:27.059 "nvme_io_md": false, 00:11:27.059 "write_zeroes": true, 00:11:27.059 "zcopy": false, 00:11:27.059 "get_zone_info": false, 00:11:27.059 "zone_management": false, 00:11:27.059 "zone_append": false, 00:11:27.059 "compare": false, 00:11:27.059 "compare_and_write": false, 00:11:27.059 "abort": false, 00:11:27.059 "seek_hole": false, 00:11:27.059 "seek_data": false, 00:11:27.059 "copy": false, 00:11:27.059 "nvme_iov_md": false 00:11:27.059 }, 00:11:27.059 "memory_domains": [ 00:11:27.059 { 00:11:27.059 "dma_device_id": "system", 00:11:27.059 "dma_device_type": 1 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.059 "dma_device_type": 2 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "dma_device_id": "system", 00:11:27.059 "dma_device_type": 1 00:11:27.059 
}, 00:11:27.059 { 00:11:27.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.059 "dma_device_type": 2 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "dma_device_id": "system", 00:11:27.059 "dma_device_type": 1 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.059 "dma_device_type": 2 00:11:27.059 } 00:11:27.059 ], 00:11:27.059 "driver_specific": { 00:11:27.059 "raid": { 00:11:27.059 "uuid": "f244eb35-2e93-45e5-b9e8-2e6d6f9ca4cc", 00:11:27.059 "strip_size_kb": 64, 00:11:27.059 "state": "online", 00:11:27.059 "raid_level": "raid0", 00:11:27.059 "superblock": true, 00:11:27.059 "num_base_bdevs": 3, 00:11:27.059 "num_base_bdevs_discovered": 3, 00:11:27.059 "num_base_bdevs_operational": 3, 00:11:27.059 "base_bdevs_list": [ 00:11:27.059 { 00:11:27.059 "name": "NewBaseBdev", 00:11:27.059 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:27.059 "is_configured": true, 00:11:27.059 "data_offset": 2048, 00:11:27.059 "data_size": 63488 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "name": "BaseBdev2", 00:11:27.059 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:27.059 "is_configured": true, 00:11:27.059 "data_offset": 2048, 00:11:27.059 "data_size": 63488 00:11:27.059 }, 00:11:27.059 { 00:11:27.059 "name": "BaseBdev3", 00:11:27.059 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:27.059 "is_configured": true, 00:11:27.059 "data_offset": 2048, 00:11:27.059 "data_size": 63488 00:11:27.059 } 00:11:27.059 ] 00:11:27.059 } 00:11:27.059 } 00:11:27.059 }' 00:11:27.059 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:11:27.318 BaseBdev2 00:11:27.318 BaseBdev3' 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.318 
23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.318 "name": "NewBaseBdev", 00:11:27.318 "aliases": [ 00:11:27.318 "a50bd7b5-90f4-4efd-b39d-ac932e206b6d" 00:11:27.318 ], 00:11:27.318 "product_name": "Malloc disk", 00:11:27.318 "block_size": 512, 00:11:27.318 "num_blocks": 65536, 00:11:27.318 "uuid": "a50bd7b5-90f4-4efd-b39d-ac932e206b6d", 00:11:27.318 "assigned_rate_limits": { 00:11:27.318 "rw_ios_per_sec": 0, 00:11:27.318 "rw_mbytes_per_sec": 0, 00:11:27.318 "r_mbytes_per_sec": 0, 00:11:27.318 "w_mbytes_per_sec": 0 00:11:27.318 }, 00:11:27.318 "claimed": true, 00:11:27.318 "claim_type": "exclusive_write", 00:11:27.318 "zoned": false, 00:11:27.318 "supported_io_types": { 00:11:27.318 "read": true, 00:11:27.318 "write": true, 00:11:27.318 "unmap": true, 00:11:27.318 "flush": true, 00:11:27.318 "reset": true, 00:11:27.318 "nvme_admin": false, 00:11:27.318 "nvme_io": false, 00:11:27.318 "nvme_io_md": false, 00:11:27.318 "write_zeroes": true, 00:11:27.318 "zcopy": true, 00:11:27.318 "get_zone_info": false, 00:11:27.318 "zone_management": false, 00:11:27.318 "zone_append": false, 00:11:27.318 "compare": false, 00:11:27.318 "compare_and_write": false, 00:11:27.318 "abort": true, 00:11:27.318 "seek_hole": false, 00:11:27.318 "seek_data": false, 00:11:27.318 "copy": true, 00:11:27.318 "nvme_iov_md": false 00:11:27.318 }, 00:11:27.318 "memory_domains": [ 00:11:27.318 { 00:11:27.318 "dma_device_id": "system", 00:11:27.318 "dma_device_type": 1 00:11:27.318 }, 00:11:27.318 { 00:11:27.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.318 "dma_device_type": 2 00:11:27.318 } 00:11:27.318 ], 00:11:27.318 
"driver_specific": {} 00:11:27.318 }' 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.318 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:27.576 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.834 "name": "BaseBdev2", 00:11:27.834 "aliases": [ 00:11:27.834 "67851700-fd19-49ca-80b0-69bb8501a856" 00:11:27.834 ], 00:11:27.834 "product_name": 
"Malloc disk", 00:11:27.834 "block_size": 512, 00:11:27.834 "num_blocks": 65536, 00:11:27.834 "uuid": "67851700-fd19-49ca-80b0-69bb8501a856", 00:11:27.834 "assigned_rate_limits": { 00:11:27.834 "rw_ios_per_sec": 0, 00:11:27.834 "rw_mbytes_per_sec": 0, 00:11:27.834 "r_mbytes_per_sec": 0, 00:11:27.834 "w_mbytes_per_sec": 0 00:11:27.834 }, 00:11:27.834 "claimed": true, 00:11:27.834 "claim_type": "exclusive_write", 00:11:27.834 "zoned": false, 00:11:27.834 "supported_io_types": { 00:11:27.834 "read": true, 00:11:27.834 "write": true, 00:11:27.834 "unmap": true, 00:11:27.834 "flush": true, 00:11:27.834 "reset": true, 00:11:27.834 "nvme_admin": false, 00:11:27.834 "nvme_io": false, 00:11:27.834 "nvme_io_md": false, 00:11:27.834 "write_zeroes": true, 00:11:27.834 "zcopy": true, 00:11:27.834 "get_zone_info": false, 00:11:27.834 "zone_management": false, 00:11:27.834 "zone_append": false, 00:11:27.834 "compare": false, 00:11:27.834 "compare_and_write": false, 00:11:27.834 "abort": true, 00:11:27.834 "seek_hole": false, 00:11:27.834 "seek_data": false, 00:11:27.834 "copy": true, 00:11:27.834 "nvme_iov_md": false 00:11:27.834 }, 00:11:27.834 "memory_domains": [ 00:11:27.834 { 00:11:27.834 "dma_device_id": "system", 00:11:27.834 "dma_device_type": 1 00:11:27.834 }, 00:11:27.834 { 00:11:27.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.834 "dma_device_type": 2 00:11:27.834 } 00:11:27.834 ], 00:11:27.834 "driver_specific": {} 00:11:27.834 }' 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.834 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.092 
23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.092 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.092 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.092 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.092 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.092 23:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.093 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.093 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.093 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:28.093 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.352 "name": "BaseBdev3", 00:11:28.352 "aliases": [ 00:11:28.352 "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33" 00:11:28.352 ], 00:11:28.352 "product_name": "Malloc disk", 00:11:28.352 "block_size": 512, 00:11:28.352 "num_blocks": 65536, 00:11:28.352 "uuid": "aca1f8c3-a6a2-4344-8cd2-50d9afce9a33", 00:11:28.352 "assigned_rate_limits": { 00:11:28.352 "rw_ios_per_sec": 0, 00:11:28.352 "rw_mbytes_per_sec": 0, 00:11:28.352 "r_mbytes_per_sec": 0, 00:11:28.352 "w_mbytes_per_sec": 0 00:11:28.352 }, 00:11:28.352 "claimed": true, 00:11:28.352 "claim_type": "exclusive_write", 00:11:28.352 "zoned": false, 00:11:28.352 "supported_io_types": { 00:11:28.352 "read": true, 00:11:28.352 "write": true, 00:11:28.352 "unmap": true, 
00:11:28.352 "flush": true, 00:11:28.352 "reset": true, 00:11:28.352 "nvme_admin": false, 00:11:28.352 "nvme_io": false, 00:11:28.352 "nvme_io_md": false, 00:11:28.352 "write_zeroes": true, 00:11:28.352 "zcopy": true, 00:11:28.352 "get_zone_info": false, 00:11:28.352 "zone_management": false, 00:11:28.352 "zone_append": false, 00:11:28.352 "compare": false, 00:11:28.352 "compare_and_write": false, 00:11:28.352 "abort": true, 00:11:28.352 "seek_hole": false, 00:11:28.352 "seek_data": false, 00:11:28.352 "copy": true, 00:11:28.352 "nvme_iov_md": false 00:11:28.352 }, 00:11:28.352 "memory_domains": [ 00:11:28.352 { 00:11:28.352 "dma_device_id": "system", 00:11:28.352 "dma_device_type": 1 00:11:28.352 }, 00:11:28.352 { 00:11:28.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.352 "dma_device_type": 2 00:11:28.352 } 00:11:28.352 ], 00:11:28.352 "driver_specific": {} 00:11:28.352 }' 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.352 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.649 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.649 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.649 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.649 23:33:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.649 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.649 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:28.925 [2024-07-24 23:33:13.660353] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:28.925 [2024-07-24 23:33:13.660373] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:28.925 [2024-07-24 23:33:13.660418] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:28.925 [2024-07-24 23:33:13.660454] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:28.925 [2024-07-24 23:33:13.660460] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea0600 name Existed_Raid, state offline 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 267471 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 267471 ']' 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 267471 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 267471 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 267471' 00:11:28.925 killing process with pid 267471 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 267471 00:11:28.925 [2024-07-24 23:33:13.718215] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:28.925 23:33:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 267471 00:11:28.925 [2024-07-24 23:33:13.759767] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:29.184 23:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:29.184 00:11:29.184 real 0m21.374s 00:11:29.184 user 0m39.584s 00:11:29.184 sys 0m3.337s 00:11:29.184 23:33:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:29.184 23:33:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:29.184 ************************************ 00:11:29.184 END TEST raid_state_function_test_sb 00:11:29.184 ************************************ 00:11:29.184 23:33:14 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:11:29.184 23:33:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:11:29.184 23:33:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:29.184 23:33:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:29.184 ************************************ 00:11:29.184 START TEST raid_superblock_test 00:11:29.184 ************************************ 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 3 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=272096 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 272096 /var/tmp/spdk-raid.sock 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:29.184 23:33:14 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 272096 ']' 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:29.184 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:29.184 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.184 [2024-07-24 23:33:14.175808] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:11:29.184 [2024-07-24 23:33:14.175845] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid272096 ] 00:11:29.442 [2024-07-24 23:33:14.240902] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:29.442 [2024-07-24 23:33:14.314355] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.442 [2024-07-24 23:33:14.376153] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.442 [2024-07-24 23:33:14.376182] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:30.009 
23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:30.009 23:33:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:30.267 malloc1 00:11:30.267 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:30.527 [2024-07-24 23:33:15.276321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:30.527 [2024-07-24 23:33:15.276355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.527 [2024-07-24 23:33:15.276365] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b8dd0 00:11:30.527 [2024-07-24 23:33:15.276371] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.527 [2024-07-24 23:33:15.277321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:30.527 [2024-07-24 23:33:15.277345] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 
00:11:30.527 pt1 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:30.527 malloc2 00:11:30.527 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:30.785 [2024-07-24 23:33:15.636644] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:30.785 [2024-07-24 23:33:15.636672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:30.785 [2024-07-24 23:33:15.636681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b98d0 00:11:30.785 [2024-07-24 23:33:15.636686] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:30.785 [2024-07-24 23:33:15.637598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:11:30.785 [2024-07-24 23:33:15.637618] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:30.785 pt2 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:30.785 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:11:31.043 malloc3 00:11:31.043 23:33:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:31.043 [2024-07-24 23:33:15.992927] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:31.043 [2024-07-24 23:33:15.992958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:31.043 [2024-07-24 23:33:15.992967] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227a740 00:11:31.043 [2024-07-24 23:33:15.992973] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:11:31.043 [2024-07-24 23:33:15.993915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:31.043 [2024-07-24 23:33:15.993934] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:31.043 pt3 00:11:31.043 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:31.043 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:31.043 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:11:31.302 [2024-07-24 23:33:16.165387] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:31.302 [2024-07-24 23:33:16.166145] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:31.302 [2024-07-24 23:33:16.166179] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:31.302 [2024-07-24 23:33:16.166272] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x227ad20 00:11:31.302 [2024-07-24 23:33:16.166278] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:31.302 [2024-07-24 23:33:16.166387] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x227c860 00:11:31.302 [2024-07-24 23:33:16.166486] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x227ad20 00:11:31.302 [2024-07-24 23:33:16.166492] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x227ad20 00:11:31.302 [2024-07-24 23:33:16.166549] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:31.302 23:33:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.302 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:31.561 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.561 "name": "raid_bdev1", 00:11:31.561 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:31.561 "strip_size_kb": 64, 00:11:31.561 "state": "online", 00:11:31.561 "raid_level": "raid0", 00:11:31.561 "superblock": true, 00:11:31.561 "num_base_bdevs": 3, 00:11:31.561 "num_base_bdevs_discovered": 3, 00:11:31.561 "num_base_bdevs_operational": 3, 00:11:31.561 "base_bdevs_list": [ 00:11:31.561 { 00:11:31.561 "name": "pt1", 00:11:31.561 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:31.561 "is_configured": true, 00:11:31.561 "data_offset": 2048, 00:11:31.561 "data_size": 63488 00:11:31.561 }, 00:11:31.561 { 
00:11:31.561 "name": "pt2", 00:11:31.561 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:31.561 "is_configured": true, 00:11:31.561 "data_offset": 2048, 00:11:31.561 "data_size": 63488 00:11:31.561 }, 00:11:31.561 { 00:11:31.561 "name": "pt3", 00:11:31.561 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:31.561 "is_configured": true, 00:11:31.561 "data_offset": 2048, 00:11:31.561 "data_size": 63488 00:11:31.561 } 00:11:31.561 ] 00:11:31.561 }' 00:11:31.561 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.561 23:33:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:31.819 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:32.078 [2024-07-24 23:33:16.959628] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:32.078 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:32.078 "name": "raid_bdev1", 00:11:32.078 "aliases": [ 00:11:32.078 "f14f0ce1-5186-4828-8ef2-fc05822a9f80" 00:11:32.078 ], 00:11:32.078 "product_name": "Raid Volume", 00:11:32.078 
"block_size": 512, 00:11:32.078 "num_blocks": 190464, 00:11:32.078 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:32.078 "assigned_rate_limits": { 00:11:32.078 "rw_ios_per_sec": 0, 00:11:32.078 "rw_mbytes_per_sec": 0, 00:11:32.078 "r_mbytes_per_sec": 0, 00:11:32.078 "w_mbytes_per_sec": 0 00:11:32.078 }, 00:11:32.078 "claimed": false, 00:11:32.078 "zoned": false, 00:11:32.078 "supported_io_types": { 00:11:32.078 "read": true, 00:11:32.078 "write": true, 00:11:32.078 "unmap": true, 00:11:32.078 "flush": true, 00:11:32.078 "reset": true, 00:11:32.078 "nvme_admin": false, 00:11:32.078 "nvme_io": false, 00:11:32.078 "nvme_io_md": false, 00:11:32.078 "write_zeroes": true, 00:11:32.078 "zcopy": false, 00:11:32.078 "get_zone_info": false, 00:11:32.078 "zone_management": false, 00:11:32.078 "zone_append": false, 00:11:32.078 "compare": false, 00:11:32.078 "compare_and_write": false, 00:11:32.078 "abort": false, 00:11:32.078 "seek_hole": false, 00:11:32.078 "seek_data": false, 00:11:32.078 "copy": false, 00:11:32.078 "nvme_iov_md": false 00:11:32.078 }, 00:11:32.078 "memory_domains": [ 00:11:32.078 { 00:11:32.078 "dma_device_id": "system", 00:11:32.078 "dma_device_type": 1 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.078 "dma_device_type": 2 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "dma_device_id": "system", 00:11:32.078 "dma_device_type": 1 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.078 "dma_device_type": 2 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "dma_device_id": "system", 00:11:32.078 "dma_device_type": 1 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.078 "dma_device_type": 2 00:11:32.078 } 00:11:32.078 ], 00:11:32.078 "driver_specific": { 00:11:32.078 "raid": { 00:11:32.078 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:32.078 "strip_size_kb": 64, 00:11:32.078 "state": "online", 00:11:32.078 
"raid_level": "raid0", 00:11:32.078 "superblock": true, 00:11:32.078 "num_base_bdevs": 3, 00:11:32.078 "num_base_bdevs_discovered": 3, 00:11:32.078 "num_base_bdevs_operational": 3, 00:11:32.078 "base_bdevs_list": [ 00:11:32.078 { 00:11:32.078 "name": "pt1", 00:11:32.078 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.078 "is_configured": true, 00:11:32.078 "data_offset": 2048, 00:11:32.078 "data_size": 63488 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "name": "pt2", 00:11:32.078 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.078 "is_configured": true, 00:11:32.078 "data_offset": 2048, 00:11:32.078 "data_size": 63488 00:11:32.078 }, 00:11:32.078 { 00:11:32.078 "name": "pt3", 00:11:32.078 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:32.078 "is_configured": true, 00:11:32.078 "data_offset": 2048, 00:11:32.078 "data_size": 63488 00:11:32.078 } 00:11:32.078 ] 00:11:32.078 } 00:11:32.078 } 00:11:32.078 }' 00:11:32.078 23:33:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:32.078 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:32.078 pt2 00:11:32.078 pt3' 00:11:32.078 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.078 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:32.078 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.337 "name": "pt1", 00:11:32.337 "aliases": [ 00:11:32.337 "00000000-0000-0000-0000-000000000001" 00:11:32.337 ], 00:11:32.337 "product_name": "passthru", 00:11:32.337 "block_size": 512, 00:11:32.337 "num_blocks": 65536, 00:11:32.337 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:11:32.337 "assigned_rate_limits": { 00:11:32.337 "rw_ios_per_sec": 0, 00:11:32.337 "rw_mbytes_per_sec": 0, 00:11:32.337 "r_mbytes_per_sec": 0, 00:11:32.337 "w_mbytes_per_sec": 0 00:11:32.337 }, 00:11:32.337 "claimed": true, 00:11:32.337 "claim_type": "exclusive_write", 00:11:32.337 "zoned": false, 00:11:32.337 "supported_io_types": { 00:11:32.337 "read": true, 00:11:32.337 "write": true, 00:11:32.337 "unmap": true, 00:11:32.337 "flush": true, 00:11:32.337 "reset": true, 00:11:32.337 "nvme_admin": false, 00:11:32.337 "nvme_io": false, 00:11:32.337 "nvme_io_md": false, 00:11:32.337 "write_zeroes": true, 00:11:32.337 "zcopy": true, 00:11:32.337 "get_zone_info": false, 00:11:32.337 "zone_management": false, 00:11:32.337 "zone_append": false, 00:11:32.337 "compare": false, 00:11:32.337 "compare_and_write": false, 00:11:32.337 "abort": true, 00:11:32.337 "seek_hole": false, 00:11:32.337 "seek_data": false, 00:11:32.337 "copy": true, 00:11:32.337 "nvme_iov_md": false 00:11:32.337 }, 00:11:32.337 "memory_domains": [ 00:11:32.337 { 00:11:32.337 "dma_device_id": "system", 00:11:32.337 "dma_device_type": 1 00:11:32.337 }, 00:11:32.337 { 00:11:32.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.337 "dma_device_type": 2 00:11:32.337 } 00:11:32.337 ], 00:11:32.337 "driver_specific": { 00:11:32.337 "passthru": { 00:11:32.337 "name": "pt1", 00:11:32.337 "base_bdev_name": "malloc1" 00:11:32.337 } 00:11:32.337 } 00:11:32.337 }' 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.337 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.595 23:33:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:32.595 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:32.854 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:32.854 "name": "pt2", 00:11:32.854 "aliases": [ 00:11:32.854 "00000000-0000-0000-0000-000000000002" 00:11:32.854 ], 00:11:32.854 "product_name": "passthru", 00:11:32.854 "block_size": 512, 00:11:32.854 "num_blocks": 65536, 00:11:32.854 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.854 "assigned_rate_limits": { 00:11:32.854 "rw_ios_per_sec": 0, 00:11:32.854 "rw_mbytes_per_sec": 0, 00:11:32.854 "r_mbytes_per_sec": 0, 00:11:32.854 "w_mbytes_per_sec": 0 00:11:32.854 }, 00:11:32.854 "claimed": true, 00:11:32.854 "claim_type": "exclusive_write", 00:11:32.854 "zoned": false, 00:11:32.854 "supported_io_types": { 00:11:32.854 "read": true, 00:11:32.854 "write": true, 00:11:32.854 "unmap": true, 00:11:32.854 "flush": true, 00:11:32.854 "reset": true, 00:11:32.854 "nvme_admin": false, 00:11:32.854 
"nvme_io": false, 00:11:32.854 "nvme_io_md": false, 00:11:32.854 "write_zeroes": true, 00:11:32.854 "zcopy": true, 00:11:32.854 "get_zone_info": false, 00:11:32.854 "zone_management": false, 00:11:32.854 "zone_append": false, 00:11:32.854 "compare": false, 00:11:32.854 "compare_and_write": false, 00:11:32.854 "abort": true, 00:11:32.854 "seek_hole": false, 00:11:32.854 "seek_data": false, 00:11:32.854 "copy": true, 00:11:32.854 "nvme_iov_md": false 00:11:32.854 }, 00:11:32.854 "memory_domains": [ 00:11:32.854 { 00:11:32.854 "dma_device_id": "system", 00:11:32.854 "dma_device_type": 1 00:11:32.854 }, 00:11:32.854 { 00:11:32.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.854 "dma_device_type": 2 00:11:32.854 } 00:11:32.854 ], 00:11:32.854 "driver_specific": { 00:11:32.854 "passthru": { 00:11:32.854 "name": "pt2", 00:11:32.854 "base_bdev_name": "malloc2" 00:11:32.855 } 00:11:32.855 } 00:11:32.855 }' 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:32.855 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:33.113 23:33:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:33.371 "name": "pt3", 00:11:33.371 "aliases": [ 00:11:33.371 "00000000-0000-0000-0000-000000000003" 00:11:33.371 ], 00:11:33.371 "product_name": "passthru", 00:11:33.371 "block_size": 512, 00:11:33.371 "num_blocks": 65536, 00:11:33.371 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:33.371 "assigned_rate_limits": { 00:11:33.371 "rw_ios_per_sec": 0, 00:11:33.371 "rw_mbytes_per_sec": 0, 00:11:33.371 "r_mbytes_per_sec": 0, 00:11:33.371 "w_mbytes_per_sec": 0 00:11:33.371 }, 00:11:33.371 "claimed": true, 00:11:33.371 "claim_type": "exclusive_write", 00:11:33.371 "zoned": false, 00:11:33.371 "supported_io_types": { 00:11:33.371 "read": true, 00:11:33.371 "write": true, 00:11:33.371 "unmap": true, 00:11:33.371 "flush": true, 00:11:33.371 "reset": true, 00:11:33.371 "nvme_admin": false, 00:11:33.371 "nvme_io": false, 00:11:33.371 "nvme_io_md": false, 00:11:33.371 "write_zeroes": true, 00:11:33.371 "zcopy": true, 00:11:33.371 "get_zone_info": false, 00:11:33.371 "zone_management": false, 00:11:33.371 "zone_append": false, 00:11:33.371 "compare": false, 00:11:33.371 "compare_and_write": false, 00:11:33.371 "abort": true, 00:11:33.371 "seek_hole": false, 00:11:33.371 "seek_data": false, 00:11:33.371 "copy": true, 00:11:33.371 "nvme_iov_md": false 00:11:33.371 }, 00:11:33.371 "memory_domains": [ 00:11:33.371 { 00:11:33.371 "dma_device_id": "system", 00:11:33.371 
"dma_device_type": 1 00:11:33.371 }, 00:11:33.371 { 00:11:33.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.371 "dma_device_type": 2 00:11:33.371 } 00:11:33.371 ], 00:11:33.371 "driver_specific": { 00:11:33.371 "passthru": { 00:11:33.371 "name": "pt3", 00:11:33.371 "base_bdev_name": "malloc3" 00:11:33.371 } 00:11:33.371 } 00:11:33.371 }' 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.371 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:33.629 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:33.629 [2024-07-24 23:33:18.615882] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:33.888 23:33:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f14f0ce1-5186-4828-8ef2-fc05822a9f80 00:11:33.888 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z f14f0ce1-5186-4828-8ef2-fc05822a9f80 ']' 00:11:33.888 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:33.888 [2024-07-24 23:33:18.780120] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:33.888 [2024-07-24 23:33:18.780135] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:33.888 [2024-07-24 23:33:18.780171] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.888 [2024-07-24 23:33:18.780213] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:33.888 [2024-07-24 23:33:18.780219] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x227ad20 name raid_bdev1, state offline 00:11:33.888 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.888 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:34.146 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:34.146 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:34.146 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:34.146 23:33:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:34.146 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i 
in "${base_bdevs_pt[@]}" 00:11:34.146 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:34.404 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:34.404 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:34.662 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:34.663 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:34.663 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:34.663 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:34.663 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:34.921 [2024-07-24 23:33:19.794727] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:34.921 [2024-07-24 23:33:19.795731] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:34.921 [2024-07-24 23:33:19.795769] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:11:34.921 [2024-07-24 23:33:19.795803] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:34.921 [2024-07-24 23:33:19.795830] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:34.921 [2024-07-24 23:33:19.795843] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:11:34.921 [2024-07-24 23:33:19.795852] bdev_raid.c:2382:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:11:34.921 [2024-07-24 23:33:19.795858] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b9250 name raid_bdev1, state configuring 00:11:34.921 request: 00:11:34.921 { 00:11:34.921 "name": "raid_bdev1", 00:11:34.921 "raid_level": "raid0", 00:11:34.921 "base_bdevs": [ 00:11:34.921 "malloc1", 00:11:34.921 "malloc2", 00:11:34.921 "malloc3" 00:11:34.921 ], 00:11:34.921 "strip_size_kb": 64, 00:11:34.921 "superblock": false, 00:11:34.921 "method": "bdev_raid_create", 00:11:34.921 "req_id": 1 00:11:34.921 } 00:11:34.921 Got JSON-RPC error response 00:11:34.921 response: 00:11:34.921 { 00:11:34.921 "code": -17, 00:11:34.921 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:34.921 } 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.921 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:35.180 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:35.180 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:35.180 23:33:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:35.180 [2024-07-24 23:33:20.135572] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on malloc1 00:11:35.180 [2024-07-24 23:33:20.135610] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:35.180 [2024-07-24 23:33:20.135621] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b9000 00:11:35.180 [2024-07-24 23:33:20.135627] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:35.180 [2024-07-24 23:33:20.136765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:35.180 [2024-07-24 23:33:20.136788] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:35.180 [2024-07-24 23:33:20.136834] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:35.180 [2024-07-24 23:33:20.136852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:35.180 pt1 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.180 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:35.437 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:35.437 "name": "raid_bdev1", 00:11:35.437 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:35.437 "strip_size_kb": 64, 00:11:35.437 "state": "configuring", 00:11:35.437 "raid_level": "raid0", 00:11:35.437 "superblock": true, 00:11:35.437 "num_base_bdevs": 3, 00:11:35.437 "num_base_bdevs_discovered": 1, 00:11:35.437 "num_base_bdevs_operational": 3, 00:11:35.437 "base_bdevs_list": [ 00:11:35.437 { 00:11:35.437 "name": "pt1", 00:11:35.437 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:35.437 "is_configured": true, 00:11:35.437 "data_offset": 2048, 00:11:35.437 "data_size": 63488 00:11:35.437 }, 00:11:35.437 { 00:11:35.437 "name": null, 00:11:35.437 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:35.437 "is_configured": false, 00:11:35.437 "data_offset": 2048, 00:11:35.437 "data_size": 63488 00:11:35.437 }, 00:11:35.437 { 00:11:35.437 "name": null, 00:11:35.437 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:35.437 "is_configured": false, 00:11:35.437 "data_offset": 2048, 00:11:35.437 "data_size": 63488 00:11:35.437 } 00:11:35.437 ] 00:11:35.437 }' 00:11:35.437 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:35.437 23:33:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.002 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:11:36.002 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:36.002 [2024-07-24 23:33:20.913598] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:36.002 [2024-07-24 23:33:20.913638] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.002 [2024-07-24 23:33:20.913649] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x227a970 00:11:36.002 [2024-07-24 23:33:20.913655] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.002 [2024-07-24 23:33:20.913905] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.002 [2024-07-24 23:33:20.913916] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:36.002 [2024-07-24 23:33:20.913961] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:36.002 [2024-07-24 23:33:20.913974] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:36.002 pt2 00:11:36.002 23:33:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:36.259 [2024-07-24 23:33:21.082056] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.259 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:36.516 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.516 "name": "raid_bdev1", 00:11:36.516 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:36.516 "strip_size_kb": 64, 00:11:36.516 "state": "configuring", 00:11:36.516 "raid_level": "raid0", 00:11:36.516 "superblock": true, 00:11:36.516 "num_base_bdevs": 3, 00:11:36.516 "num_base_bdevs_discovered": 1, 00:11:36.516 "num_base_bdevs_operational": 3, 00:11:36.516 "base_bdevs_list": [ 00:11:36.516 { 00:11:36.516 "name": "pt1", 00:11:36.516 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:36.516 "is_configured": true, 00:11:36.516 "data_offset": 2048, 00:11:36.516 "data_size": 63488 00:11:36.516 }, 00:11:36.516 { 00:11:36.516 "name": null, 00:11:36.516 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:36.516 "is_configured": false, 00:11:36.516 "data_offset": 2048, 00:11:36.516 "data_size": 63488 00:11:36.516 }, 00:11:36.516 { 00:11:36.516 "name": null, 00:11:36.516 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:36.516 "is_configured": false, 00:11:36.516 "data_offset": 2048, 00:11:36.516 "data_size": 63488 00:11:36.516 } 00:11:36.516 ] 00:11:36.516 }' 
00:11:36.516 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.516 23:33:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:37.083 [2024-07-24 23:33:21.932237] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:37.083 [2024-07-24 23:33:21.932293] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.083 [2024-07-24 23:33:21.932307] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b1ca0 00:11:37.083 [2024-07-24 23:33:21.932314] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.083 [2024-07-24 23:33:21.932575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.083 [2024-07-24 23:33:21.932588] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:37.083 [2024-07-24 23:33:21.932631] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:37.083 [2024-07-24 23:33:21.932643] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:37.083 pt2 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:37.083 23:33:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:37.341 [2024-07-24 23:33:22.104684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:37.341 [2024-07-24 23:33:22.104703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.341 [2024-07-24 23:33:22.104711] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21b0a40 00:11:37.341 [2024-07-24 23:33:22.104716] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.341 [2024-07-24 23:33:22.104900] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.341 [2024-07-24 23:33:22.104910] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:37.341 [2024-07-24 23:33:22.104938] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:11:37.341 [2024-07-24 23:33:22.104948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:37.341 [2024-07-24 23:33:22.105012] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x227c4f0 00:11:37.341 [2024-07-24 23:33:22.105017] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:37.341 [2024-07-24 23:33:22.105119] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b49e0 00:11:37.341 [2024-07-24 23:33:22.105197] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x227c4f0 00:11:37.341 [2024-07-24 23:33:22.105206] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x227c4f0 00:11:37.341 [2024-07-24 23:33:22.105264] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.341 pt3 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.341 "name": "raid_bdev1", 00:11:37.341 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:37.341 "strip_size_kb": 64, 00:11:37.341 "state": "online", 00:11:37.341 "raid_level": "raid0", 00:11:37.341 "superblock": true, 00:11:37.341 "num_base_bdevs": 3, 00:11:37.341 "num_base_bdevs_discovered": 3, 00:11:37.341 "num_base_bdevs_operational": 3, 00:11:37.341 "base_bdevs_list": [ 00:11:37.341 { 00:11:37.341 
"name": "pt1", 00:11:37.341 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:37.341 "is_configured": true, 00:11:37.341 "data_offset": 2048, 00:11:37.341 "data_size": 63488 00:11:37.341 }, 00:11:37.341 { 00:11:37.341 "name": "pt2", 00:11:37.341 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.341 "is_configured": true, 00:11:37.341 "data_offset": 2048, 00:11:37.341 "data_size": 63488 00:11:37.341 }, 00:11:37.341 { 00:11:37.341 "name": "pt3", 00:11:37.341 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:37.341 "is_configured": true, 00:11:37.341 "data_offset": 2048, 00:11:37.341 "data_size": 63488 00:11:37.341 } 00:11:37.341 ] 00:11:37.341 }' 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.341 23:33:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:37.906 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:38.184 [2024-07-24 23:33:22.939011] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:38.184 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 
-- # raid_bdev_info='{ 00:11:38.184 "name": "raid_bdev1", 00:11:38.184 "aliases": [ 00:11:38.184 "f14f0ce1-5186-4828-8ef2-fc05822a9f80" 00:11:38.184 ], 00:11:38.184 "product_name": "Raid Volume", 00:11:38.184 "block_size": 512, 00:11:38.184 "num_blocks": 190464, 00:11:38.184 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:38.184 "assigned_rate_limits": { 00:11:38.184 "rw_ios_per_sec": 0, 00:11:38.184 "rw_mbytes_per_sec": 0, 00:11:38.184 "r_mbytes_per_sec": 0, 00:11:38.184 "w_mbytes_per_sec": 0 00:11:38.184 }, 00:11:38.184 "claimed": false, 00:11:38.184 "zoned": false, 00:11:38.184 "supported_io_types": { 00:11:38.184 "read": true, 00:11:38.184 "write": true, 00:11:38.184 "unmap": true, 00:11:38.184 "flush": true, 00:11:38.184 "reset": true, 00:11:38.184 "nvme_admin": false, 00:11:38.184 "nvme_io": false, 00:11:38.184 "nvme_io_md": false, 00:11:38.184 "write_zeroes": true, 00:11:38.184 "zcopy": false, 00:11:38.184 "get_zone_info": false, 00:11:38.184 "zone_management": false, 00:11:38.184 "zone_append": false, 00:11:38.184 "compare": false, 00:11:38.184 "compare_and_write": false, 00:11:38.184 "abort": false, 00:11:38.184 "seek_hole": false, 00:11:38.184 "seek_data": false, 00:11:38.184 "copy": false, 00:11:38.184 "nvme_iov_md": false 00:11:38.184 }, 00:11:38.184 "memory_domains": [ 00:11:38.184 { 00:11:38.184 "dma_device_id": "system", 00:11:38.184 "dma_device_type": 1 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.184 "dma_device_type": 2 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "system", 00:11:38.184 "dma_device_type": 1 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.184 "dma_device_type": 2 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "system", 00:11:38.184 "dma_device_type": 1 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.184 "dma_device_type": 2 00:11:38.184 } 00:11:38.184 ], 
00:11:38.184 "driver_specific": { 00:11:38.184 "raid": { 00:11:38.184 "uuid": "f14f0ce1-5186-4828-8ef2-fc05822a9f80", 00:11:38.184 "strip_size_kb": 64, 00:11:38.184 "state": "online", 00:11:38.184 "raid_level": "raid0", 00:11:38.184 "superblock": true, 00:11:38.184 "num_base_bdevs": 3, 00:11:38.184 "num_base_bdevs_discovered": 3, 00:11:38.184 "num_base_bdevs_operational": 3, 00:11:38.184 "base_bdevs_list": [ 00:11:38.184 { 00:11:38.184 "name": "pt1", 00:11:38.184 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.184 "is_configured": true, 00:11:38.184 "data_offset": 2048, 00:11:38.184 "data_size": 63488 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "name": "pt2", 00:11:38.184 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:38.184 "is_configured": true, 00:11:38.184 "data_offset": 2048, 00:11:38.184 "data_size": 63488 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "name": "pt3", 00:11:38.184 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:38.184 "is_configured": true, 00:11:38.184 "data_offset": 2048, 00:11:38.184 "data_size": 63488 00:11:38.184 } 00:11:38.184 ] 00:11:38.184 } 00:11:38.184 } 00:11:38.184 }' 00:11:38.184 23:33:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:38.184 pt2 00:11:38.184 pt3' 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.184 "name": "pt1", 00:11:38.184 "aliases": [ 
00:11:38.184 "00000000-0000-0000-0000-000000000001" 00:11:38.184 ], 00:11:38.184 "product_name": "passthru", 00:11:38.184 "block_size": 512, 00:11:38.184 "num_blocks": 65536, 00:11:38.184 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.184 "assigned_rate_limits": { 00:11:38.184 "rw_ios_per_sec": 0, 00:11:38.184 "rw_mbytes_per_sec": 0, 00:11:38.184 "r_mbytes_per_sec": 0, 00:11:38.184 "w_mbytes_per_sec": 0 00:11:38.184 }, 00:11:38.184 "claimed": true, 00:11:38.184 "claim_type": "exclusive_write", 00:11:38.184 "zoned": false, 00:11:38.184 "supported_io_types": { 00:11:38.184 "read": true, 00:11:38.184 "write": true, 00:11:38.184 "unmap": true, 00:11:38.184 "flush": true, 00:11:38.184 "reset": true, 00:11:38.184 "nvme_admin": false, 00:11:38.184 "nvme_io": false, 00:11:38.184 "nvme_io_md": false, 00:11:38.184 "write_zeroes": true, 00:11:38.184 "zcopy": true, 00:11:38.184 "get_zone_info": false, 00:11:38.184 "zone_management": false, 00:11:38.184 "zone_append": false, 00:11:38.184 "compare": false, 00:11:38.184 "compare_and_write": false, 00:11:38.184 "abort": true, 00:11:38.184 "seek_hole": false, 00:11:38.184 "seek_data": false, 00:11:38.184 "copy": true, 00:11:38.184 "nvme_iov_md": false 00:11:38.184 }, 00:11:38.184 "memory_domains": [ 00:11:38.184 { 00:11:38.184 "dma_device_id": "system", 00:11:38.184 "dma_device_type": 1 00:11:38.184 }, 00:11:38.184 { 00:11:38.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.184 "dma_device_type": 2 00:11:38.184 } 00:11:38.184 ], 00:11:38.184 "driver_specific": { 00:11:38.184 "passthru": { 00:11:38.184 "name": "pt1", 00:11:38.184 "base_bdev_name": "malloc1" 00:11:38.184 } 00:11:38.184 } 00:11:38.184 }' 00:11:38.184 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.442 23:33:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:38.442 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.700 "name": "pt2", 00:11:38.700 "aliases": [ 00:11:38.700 "00000000-0000-0000-0000-000000000002" 00:11:38.700 ], 00:11:38.700 "product_name": "passthru", 00:11:38.700 "block_size": 512, 00:11:38.700 "num_blocks": 65536, 00:11:38.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:38.700 "assigned_rate_limits": { 00:11:38.700 "rw_ios_per_sec": 0, 00:11:38.700 "rw_mbytes_per_sec": 0, 00:11:38.700 "r_mbytes_per_sec": 0, 00:11:38.700 "w_mbytes_per_sec": 0 00:11:38.700 }, 00:11:38.700 "claimed": true, 00:11:38.700 "claim_type": "exclusive_write", 00:11:38.700 "zoned": false, 00:11:38.700 "supported_io_types": { 
00:11:38.700 "read": true, 00:11:38.700 "write": true, 00:11:38.700 "unmap": true, 00:11:38.700 "flush": true, 00:11:38.700 "reset": true, 00:11:38.700 "nvme_admin": false, 00:11:38.700 "nvme_io": false, 00:11:38.700 "nvme_io_md": false, 00:11:38.700 "write_zeroes": true, 00:11:38.700 "zcopy": true, 00:11:38.700 "get_zone_info": false, 00:11:38.700 "zone_management": false, 00:11:38.700 "zone_append": false, 00:11:38.700 "compare": false, 00:11:38.700 "compare_and_write": false, 00:11:38.700 "abort": true, 00:11:38.700 "seek_hole": false, 00:11:38.700 "seek_data": false, 00:11:38.700 "copy": true, 00:11:38.700 "nvme_iov_md": false 00:11:38.700 }, 00:11:38.700 "memory_domains": [ 00:11:38.700 { 00:11:38.700 "dma_device_id": "system", 00:11:38.700 "dma_device_type": 1 00:11:38.700 }, 00:11:38.700 { 00:11:38.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.700 "dma_device_type": 2 00:11:38.700 } 00:11:38.700 ], 00:11:38.700 "driver_specific": { 00:11:38.700 "passthru": { 00:11:38.700 "name": "pt2", 00:11:38.700 "base_bdev_name": "malloc2" 00:11:38.700 } 00:11:38.700 } 00:11:38.700 }' 00:11:38.700 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.956 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:38.957 23:33:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.214 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.214 "name": "pt3", 00:11:39.214 "aliases": [ 00:11:39.214 "00000000-0000-0000-0000-000000000003" 00:11:39.214 ], 00:11:39.214 "product_name": "passthru", 00:11:39.214 "block_size": 512, 00:11:39.214 "num_blocks": 65536, 00:11:39.214 "uuid": "00000000-0000-0000-0000-000000000003", 00:11:39.214 "assigned_rate_limits": { 00:11:39.214 "rw_ios_per_sec": 0, 00:11:39.214 "rw_mbytes_per_sec": 0, 00:11:39.214 "r_mbytes_per_sec": 0, 00:11:39.214 "w_mbytes_per_sec": 0 00:11:39.214 }, 00:11:39.214 "claimed": true, 00:11:39.214 "claim_type": "exclusive_write", 00:11:39.214 "zoned": false, 00:11:39.214 "supported_io_types": { 00:11:39.214 "read": true, 00:11:39.214 "write": true, 00:11:39.214 "unmap": true, 00:11:39.214 "flush": true, 00:11:39.214 "reset": true, 00:11:39.214 "nvme_admin": false, 00:11:39.214 "nvme_io": false, 00:11:39.214 "nvme_io_md": false, 00:11:39.214 "write_zeroes": true, 00:11:39.214 "zcopy": true, 00:11:39.214 "get_zone_info": false, 00:11:39.214 "zone_management": false, 00:11:39.214 "zone_append": false, 00:11:39.214 "compare": false, 00:11:39.214 "compare_and_write": false, 00:11:39.214 "abort": true, 00:11:39.214 "seek_hole": false, 00:11:39.214 "seek_data": 
false, 00:11:39.214 "copy": true, 00:11:39.214 "nvme_iov_md": false 00:11:39.214 }, 00:11:39.214 "memory_domains": [ 00:11:39.214 { 00:11:39.214 "dma_device_id": "system", 00:11:39.214 "dma_device_type": 1 00:11:39.214 }, 00:11:39.214 { 00:11:39.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.214 "dma_device_type": 2 00:11:39.214 } 00:11:39.214 ], 00:11:39.214 "driver_specific": { 00:11:39.214 "passthru": { 00:11:39.214 "name": "pt3", 00:11:39.214 "base_bdev_name": "malloc3" 00:11:39.214 } 00:11:39.214 } 00:11:39.214 }' 00:11:39.214 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.214 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.214 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.214 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:39.471 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 
-- # jq -r '.[] | .uuid' 00:11:39.728 [2024-07-24 23:33:24.555188] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' f14f0ce1-5186-4828-8ef2-fc05822a9f80 '!=' f14f0ce1-5186-4828-8ef2-fc05822a9f80 ']' 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 272096 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 272096 ']' 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 272096 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:39.728 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 272096 00:11:39.729 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:39.729 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:39.729 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 272096' 00:11:39.729 killing process with pid 272096 00:11:39.729 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 272096 00:11:39.729 [2024-07-24 23:33:24.604259] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:39.729 [2024-07-24 23:33:24.604306] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:39.729 
[2024-07-24 23:33:24.604342] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:39.729 [2024-07-24 23:33:24.604348] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x227c4f0 name raid_bdev1, state offline 00:11:39.729 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 272096 00:11:39.729 [2024-07-24 23:33:24.627266] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:39.987 23:33:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:39.987 00:11:39.987 real 0m10.674s 00:11:39.987 user 0m19.489s 00:11:39.987 sys 0m1.642s 00:11:39.987 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:39.987 23:33:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.987 ************************************ 00:11:39.987 END TEST raid_superblock_test 00:11:39.987 ************************************ 00:11:39.987 23:33:24 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:11:39.987 23:33:24 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:39.987 23:33:24 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:39.987 23:33:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:39.987 ************************************ 00:11:39.987 START TEST raid_read_error_test 00:11:39.987 ************************************ 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 read 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 
00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:39.987 23:33:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.dBPF9opHGq 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=274170 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 274170 /var/tmp/spdk-raid.sock 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 274170 ']' 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:39.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:39.987 23:33:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.987 [2024-07-24 23:33:24.926722] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:11:39.987 [2024-07-24 23:33:24.926763] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid274170 ] 00:11:40.246 [2024-07-24 23:33:24.991357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:40.246 [2024-07-24 23:33:25.069565] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:40.246 [2024-07-24 23:33:25.124127] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.246 [2024-07-24 23:33:25.124155] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:40.812 23:33:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:40.812 23:33:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:11:40.812 23:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:40.812 23:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:41.070 BaseBdev1_malloc 00:11:41.070 23:33:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:41.070 true 00:11:41.070 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:41.328 [2024-07-24 23:33:26.187900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:41.328 [2024-07-24 23:33:26.187932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:11:41.328 [2024-07-24 23:33:26.187944] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe63550 00:11:41.328 [2024-07-24 23:33:26.187950] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:41.328 [2024-07-24 23:33:26.189172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:41.328 [2024-07-24 23:33:26.189194] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:41.328 BaseBdev1 00:11:41.328 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.328 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:41.587 BaseBdev2_malloc 00:11:41.587 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:41.587 true 00:11:41.587 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:41.845 [2024-07-24 23:33:26.684714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:41.845 [2024-07-24 23:33:26.684755] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:41.845 [2024-07-24 23:33:26.684766] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe67d90 00:11:41.845 [2024-07-24 23:33:26.684772] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:41.845 [2024-07-24 23:33:26.685781] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:41.845 [2024-07-24 23:33:26.685802] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:41.845 BaseBdev2 00:11:41.845 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.845 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:11:42.103 BaseBdev3_malloc 00:11:42.103 23:33:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:11:42.103 true 00:11:42.103 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:11:42.361 [2024-07-24 23:33:27.177411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:11:42.361 [2024-07-24 23:33:27.177442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.361 [2024-07-24 23:33:27.177453] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe6a050 00:11:42.361 [2024-07-24 23:33:27.177459] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.361 [2024-07-24 23:33:27.178485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.361 [2024-07-24 23:33:27.178506] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:11:42.361 BaseBdev3 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:11:42.361 [2024-07-24 23:33:27.329833] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.361 [2024-07-24 23:33:27.330663] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:42.361 [2024-07-24 23:33:27.330708] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:42.361 [2024-07-24 23:33:27.330845] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe6b700 00:11:42.361 [2024-07-24 23:33:27.330852] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:42.361 [2024-07-24 23:33:27.330976] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe6b2a0 00:11:42.361 [2024-07-24 23:33:27.331072] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe6b700 00:11:42.361 [2024-07-24 23:33:27.331077] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe6b700 00:11:42.361 [2024-07-24 23:33:27.331143] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:42.361 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.362 
23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.362 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:42.620 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.620 "name": "raid_bdev1", 00:11:42.620 "uuid": "d570e8f6-ddbd-4ec3-8116-e08a1153fb54", 00:11:42.620 "strip_size_kb": 64, 00:11:42.620 "state": "online", 00:11:42.620 "raid_level": "raid0", 00:11:42.620 "superblock": true, 00:11:42.620 "num_base_bdevs": 3, 00:11:42.620 "num_base_bdevs_discovered": 3, 00:11:42.620 "num_base_bdevs_operational": 3, 00:11:42.620 "base_bdevs_list": [ 00:11:42.620 { 00:11:42.620 "name": "BaseBdev1", 00:11:42.620 "uuid": "01697663-3f00-5587-9f04-3f77de3d89fc", 00:11:42.620 "is_configured": true, 00:11:42.620 "data_offset": 2048, 00:11:42.620 "data_size": 63488 00:11:42.620 }, 00:11:42.620 { 00:11:42.620 "name": "BaseBdev2", 00:11:42.620 "uuid": "c955bf1d-f69d-5996-bc3a-9a586d203e9b", 00:11:42.620 "is_configured": true, 00:11:42.620 "data_offset": 2048, 00:11:42.620 "data_size": 63488 00:11:42.620 }, 00:11:42.620 { 00:11:42.620 "name": "BaseBdev3", 00:11:42.620 "uuid": "e533bb66-0bb1-5dcb-a551-b8cae5d05326", 00:11:42.620 "is_configured": true, 00:11:42.620 "data_offset": 2048, 00:11:42.620 "data_size": 63488 00:11:42.620 } 00:11:42.620 ] 00:11:42.620 }' 00:11:42.620 23:33:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.620 23:33:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.188 23:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:11:43.188 23:33:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:43.188 [2024-07-24 23:33:28.096017] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcb9a30 00:11:44.125 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:44.386 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:44.386 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:44.386 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.387 "name": "raid_bdev1", 00:11:44.387 "uuid": "d570e8f6-ddbd-4ec3-8116-e08a1153fb54", 00:11:44.387 "strip_size_kb": 64, 00:11:44.387 "state": "online", 00:11:44.387 "raid_level": "raid0", 00:11:44.387 "superblock": true, 00:11:44.387 "num_base_bdevs": 3, 00:11:44.387 "num_base_bdevs_discovered": 3, 00:11:44.387 "num_base_bdevs_operational": 3, 00:11:44.387 "base_bdevs_list": [ 00:11:44.387 { 00:11:44.387 "name": "BaseBdev1", 00:11:44.387 "uuid": "01697663-3f00-5587-9f04-3f77de3d89fc", 00:11:44.387 "is_configured": true, 00:11:44.387 "data_offset": 2048, 00:11:44.387 "data_size": 63488 00:11:44.387 }, 00:11:44.387 { 00:11:44.387 "name": "BaseBdev2", 00:11:44.387 "uuid": "c955bf1d-f69d-5996-bc3a-9a586d203e9b", 00:11:44.387 "is_configured": true, 00:11:44.387 "data_offset": 2048, 00:11:44.387 "data_size": 63488 00:11:44.387 }, 00:11:44.387 { 00:11:44.387 "name": "BaseBdev3", 00:11:44.387 "uuid": "e533bb66-0bb1-5dcb-a551-b8cae5d05326", 00:11:44.387 "is_configured": true, 00:11:44.387 "data_offset": 2048, 00:11:44.387 "data_size": 63488 00:11:44.387 } 00:11:44.387 ] 00:11:44.387 }' 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.387 23:33:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.956 23:33:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:45.235 [2024-07-24 23:33:30.016159] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:45.235 [2024-07-24 23:33:30.016189] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:45.235 [2024-07-24 23:33:30.018329] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:45.235 [2024-07-24 23:33:30.018357] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.235 [2024-07-24 23:33:30.018380] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:45.235 [2024-07-24 23:33:30.018386] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe6b700 name raid_bdev1, state offline 00:11:45.235 0 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 274170 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 274170 ']' 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 274170 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 274170 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 274170' 00:11:45.235 killing process with pid 274170 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 274170 00:11:45.235 [2024-07-24 23:33:30.079326] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:11:45.235 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 274170 00:11:45.236 [2024-07-24 23:33:30.097199] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.dBPF9opHGq 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:11:45.506 00:11:45.506 real 0m5.422s 00:11:45.506 user 0m8.455s 00:11:45.506 sys 0m0.773s 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:45.506 23:33:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.506 ************************************ 00:11:45.506 END TEST raid_read_error_test 00:11:45.506 ************************************ 00:11:45.506 23:33:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:11:45.506 23:33:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:45.506 23:33:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:45.506 23:33:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:45.506 ************************************ 00:11:45.506 START TEST raid_write_error_test 00:11:45.506 ************************************ 
00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 write 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
local strip_size 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.UloDF0ThWm 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=275188 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 275188 /var/tmp/spdk-raid.sock 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 275188 ']' 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:45.506 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:45.506 23:33:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.506 [2024-07-24 23:33:30.419147] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:11:45.506 [2024-07-24 23:33:30.419187] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid275188 ] 00:11:45.506 [2024-07-24 23:33:30.482175] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:45.765 [2024-07-24 23:33:30.561992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:45.765 [2024-07-24 23:33:30.618551] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.765 [2024-07-24 23:33:30.618580] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.332 23:33:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:46.332 23:33:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:11:46.332 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:46.332 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:46.590 BaseBdev1_malloc 00:11:46.590 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:46.590 true 00:11:46.590 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:46.848 [2024-07-24 23:33:31.690726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:46.848 [2024-07-24 23:33:31.690758] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:46.848 [2024-07-24 23:33:31.690769] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x222e550 00:11:46.848 [2024-07-24 23:33:31.690776] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:46.848 [2024-07-24 23:33:31.691973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:46.848 [2024-07-24 23:33:31.691995] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:46.848 BaseBdev1 00:11:46.848 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:46.848 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:47.106 BaseBdev2_malloc 00:11:47.106 23:33:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:47.106 true 00:11:47.106 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:47.365 [2024-07-24 23:33:32.183511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:47.365 [2024-07-24 23:33:32.183542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.365 [2024-07-24 
23:33:32.183552] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2232d90 00:11:47.365 [2024-07-24 23:33:32.183558] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.365 [2024-07-24 23:33:32.184580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.365 [2024-07-24 23:33:32.184602] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:47.365 BaseBdev2 00:11:47.365 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:47.365 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:11:47.365 BaseBdev3_malloc 00:11:47.365 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:11:47.623 true 00:11:47.623 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:11:47.882 [2024-07-24 23:33:32.652337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:11:47.882 [2024-07-24 23:33:32.652367] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.882 [2024-07-24 23:33:32.652377] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2235050 00:11:47.882 [2024-07-24 23:33:32.652383] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.882 [2024-07-24 23:33:32.653428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.882 [2024-07-24 23:33:32.653448] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:11:47.882 BaseBdev3 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:11:47.882 [2024-07-24 23:33:32.820796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:47.882 [2024-07-24 23:33:32.821658] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:47.882 [2024-07-24 23:33:32.821704] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:47.882 [2024-07-24 23:33:32.821839] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2236700 00:11:47.882 [2024-07-24 23:33:32.821846] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:47.882 [2024-07-24 23:33:32.821974] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22362a0 00:11:47.882 [2024-07-24 23:33:32.822070] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2236700 00:11:47.882 [2024-07-24 23:33:32.822075] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2236700 00:11:47.882 [2024-07-24 23:33:32.822139] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:47.882 23:33:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.882 23:33:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:48.141 23:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.141 "name": "raid_bdev1", 00:11:48.141 "uuid": "0d76ac9e-9a7b-4727-9c45-c2e73bdbe3a2", 00:11:48.141 "strip_size_kb": 64, 00:11:48.141 "state": "online", 00:11:48.141 "raid_level": "raid0", 00:11:48.141 "superblock": true, 00:11:48.141 "num_base_bdevs": 3, 00:11:48.141 "num_base_bdevs_discovered": 3, 00:11:48.141 "num_base_bdevs_operational": 3, 00:11:48.141 "base_bdevs_list": [ 00:11:48.141 { 00:11:48.141 "name": "BaseBdev1", 00:11:48.141 "uuid": "043e2e59-9cb3-5468-b698-6a36d2d2aaf0", 00:11:48.141 "is_configured": true, 00:11:48.141 "data_offset": 2048, 00:11:48.141 "data_size": 63488 00:11:48.141 }, 00:11:48.141 { 00:11:48.141 "name": "BaseBdev2", 00:11:48.141 "uuid": "21944924-fc7a-5966-a7f2-eeee9454c537", 00:11:48.141 "is_configured": true, 00:11:48.141 "data_offset": 2048, 00:11:48.141 "data_size": 63488 00:11:48.141 }, 00:11:48.141 { 00:11:48.141 "name": "BaseBdev3", 00:11:48.141 "uuid": 
"6d231d98-29a7-5277-b167-7ffcc9f974a7", 00:11:48.141 "is_configured": true, 00:11:48.141 "data_offset": 2048, 00:11:48.141 "data_size": 63488 00:11:48.141 } 00:11:48.141 ] 00:11:48.141 }' 00:11:48.141 23:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.141 23:33:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.705 23:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:48.705 23:33:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:48.705 [2024-07-24 23:33:33.570942] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2084a30 00:11:49.637 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:49.894 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.894 "name": "raid_bdev1", 00:11:49.894 "uuid": "0d76ac9e-9a7b-4727-9c45-c2e73bdbe3a2", 00:11:49.894 "strip_size_kb": 64, 00:11:49.894 "state": "online", 00:11:49.894 "raid_level": "raid0", 00:11:49.894 "superblock": true, 00:11:49.894 "num_base_bdevs": 3, 00:11:49.894 "num_base_bdevs_discovered": 3, 00:11:49.894 "num_base_bdevs_operational": 3, 00:11:49.894 "base_bdevs_list": [ 00:11:49.894 { 00:11:49.894 "name": "BaseBdev1", 00:11:49.894 "uuid": "043e2e59-9cb3-5468-b698-6a36d2d2aaf0", 00:11:49.894 "is_configured": true, 00:11:49.894 "data_offset": 2048, 00:11:49.894 "data_size": 63488 00:11:49.894 }, 00:11:49.894 { 00:11:49.894 "name": "BaseBdev2", 00:11:49.894 "uuid": "21944924-fc7a-5966-a7f2-eeee9454c537", 00:11:49.894 "is_configured": true, 00:11:49.894 "data_offset": 2048, 00:11:49.894 "data_size": 63488 00:11:49.894 }, 00:11:49.894 { 00:11:49.894 "name": "BaseBdev3", 00:11:49.895 "uuid": "6d231d98-29a7-5277-b167-7ffcc9f974a7", 00:11:49.895 "is_configured": true, 00:11:49.895 "data_offset": 2048, 00:11:49.895 "data_size": 63488 00:11:49.895 } 00:11:49.895 
] 00:11:49.895 }' 00:11:49.895 23:33:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.895 23:33:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.461 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:50.720 [2024-07-24 23:33:35.475651] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:50.720 [2024-07-24 23:33:35.475679] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:50.720 [2024-07-24 23:33:35.477640] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:50.720 [2024-07-24 23:33:35.477665] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.720 [2024-07-24 23:33:35.477687] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:50.720 [2024-07-24 23:33:35.477692] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2236700 name raid_bdev1, state offline 00:11:50.720 0 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 275188 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 275188 ']' 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 275188 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 275188 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:50.720 
23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 275188' 00:11:50.720 killing process with pid 275188 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 275188 00:11:50.720 [2024-07-24 23:33:35.538490] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:50.720 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 275188 00:11:50.720 [2024-07-24 23:33:35.556239] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.UloDF0ThWm 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:11:50.979 00:11:50.979 real 0m5.386s 00:11:50.979 user 0m8.362s 00:11:50.979 sys 0m0.784s 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:50.979 23:33:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.979 ************************************ 00:11:50.979 END TEST raid_write_error_test 00:11:50.979 ************************************ 00:11:50.979 23:33:35 bdev_raid -- 
bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:50.979 23:33:35 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:11:50.979 23:33:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:50.979 23:33:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:50.979 23:33:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:50.979 ************************************ 00:11:50.979 START TEST raid_state_function_test 00:11:50.979 ************************************ 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 false 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=276193 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 276193' 00:11:50.979 Process raid pid: 276193 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:50.979 23:33:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 276193 /var/tmp/spdk-raid.sock 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 276193 ']' 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:50.980 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:50.980 23:33:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.980 [2024-07-24 23:33:35.866209] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:11:50.980 [2024-07-24 23:33:35.866246] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:50.980 [2024-07-24 23:33:35.929451] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:51.238 [2024-07-24 23:33:36.008367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:51.238 [2024-07-24 23:33:36.060519] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:51.238 [2024-07-24 23:33:36.060542] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:51.805 23:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:51.805 23:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:11:51.805 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:52.063 [2024-07-24 23:33:36.807340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:52.063 [2024-07-24 23:33:36.807372] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:52.063 [2024-07-24 23:33:36.807378] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:52.063 [2024-07-24 23:33:36.807384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:52.063 [2024-07-24 23:33:36.807388] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:52.063 [2024-07-24 23:33:36.807393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:52.063 
23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.063 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.064 "name": "Existed_Raid", 00:11:52.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.064 "strip_size_kb": 64, 00:11:52.064 "state": "configuring", 00:11:52.064 "raid_level": "concat", 00:11:52.064 "superblock": false, 00:11:52.064 "num_base_bdevs": 3, 00:11:52.064 "num_base_bdevs_discovered": 0, 00:11:52.064 "num_base_bdevs_operational": 3, 00:11:52.064 "base_bdevs_list": [ 00:11:52.064 { 
00:11:52.064 "name": "BaseBdev1", 00:11:52.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.064 "is_configured": false, 00:11:52.064 "data_offset": 0, 00:11:52.064 "data_size": 0 00:11:52.064 }, 00:11:52.064 { 00:11:52.064 "name": "BaseBdev2", 00:11:52.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.064 "is_configured": false, 00:11:52.064 "data_offset": 0, 00:11:52.064 "data_size": 0 00:11:52.064 }, 00:11:52.064 { 00:11:52.064 "name": "BaseBdev3", 00:11:52.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.064 "is_configured": false, 00:11:52.064 "data_offset": 0, 00:11:52.064 "data_size": 0 00:11:52.064 } 00:11:52.064 ] 00:11:52.064 }' 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.064 23:33:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:52.631 23:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:52.631 [2024-07-24 23:33:37.609327] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:52.631 [2024-07-24 23:33:37.609351] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc1b30 name Existed_Raid, state configuring 00:11:52.631 23:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:52.889 [2024-07-24 23:33:37.793817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:52.889 [2024-07-24 23:33:37.793836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:52.889 [2024-07-24 23:33:37.793841] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:52.889 [2024-07-24 23:33:37.793846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:52.889 [2024-07-24 23:33:37.793849] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:52.889 [2024-07-24 23:33:37.793854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:52.889 23:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:53.148 [2024-07-24 23:33:37.974413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:53.148 BaseBdev1 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:53.148 23:33:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:53.406 23:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:53.406 [ 00:11:53.406 { 00:11:53.406 "name": "BaseBdev1", 00:11:53.406 "aliases": [ 00:11:53.406 
"58a142b9-bbe1-4e29-acb1-4fb12d71bb9d" 00:11:53.406 ], 00:11:53.406 "product_name": "Malloc disk", 00:11:53.406 "block_size": 512, 00:11:53.406 "num_blocks": 65536, 00:11:53.406 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:53.406 "assigned_rate_limits": { 00:11:53.406 "rw_ios_per_sec": 0, 00:11:53.406 "rw_mbytes_per_sec": 0, 00:11:53.406 "r_mbytes_per_sec": 0, 00:11:53.406 "w_mbytes_per_sec": 0 00:11:53.406 }, 00:11:53.406 "claimed": true, 00:11:53.406 "claim_type": "exclusive_write", 00:11:53.406 "zoned": false, 00:11:53.406 "supported_io_types": { 00:11:53.406 "read": true, 00:11:53.406 "write": true, 00:11:53.406 "unmap": true, 00:11:53.406 "flush": true, 00:11:53.406 "reset": true, 00:11:53.406 "nvme_admin": false, 00:11:53.406 "nvme_io": false, 00:11:53.406 "nvme_io_md": false, 00:11:53.406 "write_zeroes": true, 00:11:53.406 "zcopy": true, 00:11:53.406 "get_zone_info": false, 00:11:53.406 "zone_management": false, 00:11:53.406 "zone_append": false, 00:11:53.406 "compare": false, 00:11:53.406 "compare_and_write": false, 00:11:53.406 "abort": true, 00:11:53.406 "seek_hole": false, 00:11:53.406 "seek_data": false, 00:11:53.406 "copy": true, 00:11:53.406 "nvme_iov_md": false 00:11:53.406 }, 00:11:53.406 "memory_domains": [ 00:11:53.406 { 00:11:53.406 "dma_device_id": "system", 00:11:53.407 "dma_device_type": 1 00:11:53.407 }, 00:11:53.407 { 00:11:53.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:53.407 "dma_device_type": 2 00:11:53.407 } 00:11:53.407 ], 00:11:53.407 "driver_specific": {} 00:11:53.407 } 00:11:53.407 ] 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.407 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.665 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.665 "name": "Existed_Raid", 00:11:53.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.665 "strip_size_kb": 64, 00:11:53.665 "state": "configuring", 00:11:53.665 "raid_level": "concat", 00:11:53.665 "superblock": false, 00:11:53.665 "num_base_bdevs": 3, 00:11:53.665 "num_base_bdevs_discovered": 1, 00:11:53.665 "num_base_bdevs_operational": 3, 00:11:53.665 "base_bdevs_list": [ 00:11:53.665 { 00:11:53.665 "name": "BaseBdev1", 00:11:53.665 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:53.665 "is_configured": true, 00:11:53.665 "data_offset": 0, 00:11:53.665 "data_size": 65536 00:11:53.665 }, 00:11:53.665 { 00:11:53.665 "name": "BaseBdev2", 00:11:53.665 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:53.665 "is_configured": false, 00:11:53.665 "data_offset": 0, 00:11:53.665 "data_size": 0 00:11:53.665 }, 00:11:53.665 { 00:11:53.665 "name": "BaseBdev3", 00:11:53.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.665 "is_configured": false, 00:11:53.665 "data_offset": 0, 00:11:53.665 "data_size": 0 00:11:53.665 } 00:11:53.665 ] 00:11:53.665 }' 00:11:53.665 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.665 23:33:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.231 23:33:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:54.231 [2024-07-24 23:33:39.145435] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:54.231 [2024-07-24 23:33:39.145466] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc13a0 name Existed_Raid, state configuring 00:11:54.231 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:54.489 [2024-07-24 23:33:39.313917] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:54.489 [2024-07-24 23:33:39.314956] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:54.489 [2024-07-24 23:33:39.314980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:54.489 [2024-07-24 23:33:39.314985] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:54.489 [2024-07-24 23:33:39.314990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.489 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.747 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.747 "name": "Existed_Raid", 00:11:54.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.747 "strip_size_kb": 64, 00:11:54.747 "state": "configuring", 00:11:54.747 
"raid_level": "concat", 00:11:54.747 "superblock": false, 00:11:54.747 "num_base_bdevs": 3, 00:11:54.747 "num_base_bdevs_discovered": 1, 00:11:54.747 "num_base_bdevs_operational": 3, 00:11:54.747 "base_bdevs_list": [ 00:11:54.747 { 00:11:54.747 "name": "BaseBdev1", 00:11:54.747 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:54.747 "is_configured": true, 00:11:54.747 "data_offset": 0, 00:11:54.747 "data_size": 65536 00:11:54.747 }, 00:11:54.747 { 00:11:54.747 "name": "BaseBdev2", 00:11:54.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.747 "is_configured": false, 00:11:54.747 "data_offset": 0, 00:11:54.747 "data_size": 0 00:11:54.747 }, 00:11:54.747 { 00:11:54.747 "name": "BaseBdev3", 00:11:54.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.747 "is_configured": false, 00:11:54.747 "data_offset": 0, 00:11:54.747 "data_size": 0 00:11:54.747 } 00:11:54.747 ] 00:11:54.747 }' 00:11:54.747 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.747 23:33:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.004 23:33:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:55.262 [2024-07-24 23:33:40.146605] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:55.262 BaseBdev2 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:55.262 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:55.519 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:55.519 [ 00:11:55.519 { 00:11:55.519 "name": "BaseBdev2", 00:11:55.519 "aliases": [ 00:11:55.519 "15949d72-3be0-44ef-9da6-dc7ee9fe06dd" 00:11:55.519 ], 00:11:55.519 "product_name": "Malloc disk", 00:11:55.519 "block_size": 512, 00:11:55.519 "num_blocks": 65536, 00:11:55.519 "uuid": "15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:55.519 "assigned_rate_limits": { 00:11:55.519 "rw_ios_per_sec": 0, 00:11:55.519 "rw_mbytes_per_sec": 0, 00:11:55.519 "r_mbytes_per_sec": 0, 00:11:55.519 "w_mbytes_per_sec": 0 00:11:55.519 }, 00:11:55.519 "claimed": true, 00:11:55.519 "claim_type": "exclusive_write", 00:11:55.519 "zoned": false, 00:11:55.519 "supported_io_types": { 00:11:55.519 "read": true, 00:11:55.519 "write": true, 00:11:55.519 "unmap": true, 00:11:55.519 "flush": true, 00:11:55.519 "reset": true, 00:11:55.519 "nvme_admin": false, 00:11:55.519 "nvme_io": false, 00:11:55.519 "nvme_io_md": false, 00:11:55.519 "write_zeroes": true, 00:11:55.519 "zcopy": true, 00:11:55.519 "get_zone_info": false, 00:11:55.519 "zone_management": false, 00:11:55.519 "zone_append": false, 00:11:55.519 "compare": false, 00:11:55.519 "compare_and_write": false, 00:11:55.519 "abort": true, 00:11:55.519 "seek_hole": false, 00:11:55.519 "seek_data": false, 00:11:55.519 "copy": true, 00:11:55.519 "nvme_iov_md": false 00:11:55.519 }, 00:11:55.519 "memory_domains": [ 00:11:55.519 { 00:11:55.519 "dma_device_id": "system", 
00:11:55.519 "dma_device_type": 1 00:11:55.519 }, 00:11:55.519 { 00:11:55.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.520 "dma_device_type": 2 00:11:55.520 } 00:11:55.520 ], 00:11:55.520 "driver_specific": {} 00:11:55.520 } 00:11:55.520 ] 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.777 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.777 "name": "Existed_Raid", 00:11:55.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.777 "strip_size_kb": 64, 00:11:55.777 "state": "configuring", 00:11:55.777 "raid_level": "concat", 00:11:55.777 "superblock": false, 00:11:55.777 "num_base_bdevs": 3, 00:11:55.778 "num_base_bdevs_discovered": 2, 00:11:55.778 "num_base_bdevs_operational": 3, 00:11:55.778 "base_bdevs_list": [ 00:11:55.778 { 00:11:55.778 "name": "BaseBdev1", 00:11:55.778 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:55.778 "is_configured": true, 00:11:55.778 "data_offset": 0, 00:11:55.778 "data_size": 65536 00:11:55.778 }, 00:11:55.778 { 00:11:55.778 "name": "BaseBdev2", 00:11:55.778 "uuid": "15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:55.778 "is_configured": true, 00:11:55.778 "data_offset": 0, 00:11:55.778 "data_size": 65536 00:11:55.778 }, 00:11:55.778 { 00:11:55.778 "name": "BaseBdev3", 00:11:55.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.778 "is_configured": false, 00:11:55.778 "data_offset": 0, 00:11:55.778 "data_size": 0 00:11:55.778 } 00:11:55.778 ] 00:11:55.778 }' 00:11:55.778 23:33:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.778 23:33:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.344 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:56.344 [2024-07-24 23:33:41.336262] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:56.344 [2024-07-24 23:33:41.336289] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xcc22a0 00:11:56.344 [2024-07-24 23:33:41.336293] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:56.344 [2024-07-24 23:33:41.336418] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcbade0 00:11:56.344 [2024-07-24 23:33:41.336504] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcc22a0 00:11:56.344 [2024-07-24 23:33:41.336510] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcc22a0 00:11:56.344 [2024-07-24 23:33:41.336620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.344 BaseBdev3 00:11:56.601 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:11:56.601 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:11:56.601 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:56.601 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:56.602 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:56.602 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:56.602 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:56.602 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:56.859 [ 00:11:56.859 { 00:11:56.859 "name": "BaseBdev3", 00:11:56.859 "aliases": [ 00:11:56.859 "b81c002f-df59-4ba0-b7a3-4294c7853154" 00:11:56.859 ], 00:11:56.859 "product_name": "Malloc disk", 00:11:56.859 "block_size": 512, 00:11:56.859 "num_blocks": 65536, 00:11:56.859 "uuid": 
"b81c002f-df59-4ba0-b7a3-4294c7853154", 00:11:56.859 "assigned_rate_limits": { 00:11:56.859 "rw_ios_per_sec": 0, 00:11:56.859 "rw_mbytes_per_sec": 0, 00:11:56.859 "r_mbytes_per_sec": 0, 00:11:56.859 "w_mbytes_per_sec": 0 00:11:56.859 }, 00:11:56.859 "claimed": true, 00:11:56.859 "claim_type": "exclusive_write", 00:11:56.859 "zoned": false, 00:11:56.859 "supported_io_types": { 00:11:56.859 "read": true, 00:11:56.859 "write": true, 00:11:56.859 "unmap": true, 00:11:56.859 "flush": true, 00:11:56.859 "reset": true, 00:11:56.859 "nvme_admin": false, 00:11:56.859 "nvme_io": false, 00:11:56.859 "nvme_io_md": false, 00:11:56.859 "write_zeroes": true, 00:11:56.859 "zcopy": true, 00:11:56.859 "get_zone_info": false, 00:11:56.859 "zone_management": false, 00:11:56.859 "zone_append": false, 00:11:56.859 "compare": false, 00:11:56.859 "compare_and_write": false, 00:11:56.859 "abort": true, 00:11:56.859 "seek_hole": false, 00:11:56.859 "seek_data": false, 00:11:56.859 "copy": true, 00:11:56.859 "nvme_iov_md": false 00:11:56.859 }, 00:11:56.859 "memory_domains": [ 00:11:56.859 { 00:11:56.859 "dma_device_id": "system", 00:11:56.859 "dma_device_type": 1 00:11:56.859 }, 00:11:56.859 { 00:11:56.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.859 "dma_device_type": 2 00:11:56.859 } 00:11:56.859 ], 00:11:56.859 "driver_specific": {} 00:11:56.859 } 00:11:56.859 ] 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:56.859 23:33:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.859 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.117 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.117 "name": "Existed_Raid", 00:11:57.117 "uuid": "2da9f802-3048-4473-8f28-9fb99738f795", 00:11:57.117 "strip_size_kb": 64, 00:11:57.117 "state": "online", 00:11:57.117 "raid_level": "concat", 00:11:57.117 "superblock": false, 00:11:57.117 "num_base_bdevs": 3, 00:11:57.117 "num_base_bdevs_discovered": 3, 00:11:57.117 "num_base_bdevs_operational": 3, 00:11:57.117 "base_bdevs_list": [ 00:11:57.117 { 00:11:57.117 "name": "BaseBdev1", 00:11:57.117 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:57.117 "is_configured": true, 00:11:57.117 "data_offset": 0, 00:11:57.117 "data_size": 65536 00:11:57.117 }, 00:11:57.117 { 00:11:57.117 "name": "BaseBdev2", 00:11:57.117 "uuid": 
"15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:57.117 "is_configured": true, 00:11:57.117 "data_offset": 0, 00:11:57.117 "data_size": 65536 00:11:57.117 }, 00:11:57.117 { 00:11:57.117 "name": "BaseBdev3", 00:11:57.117 "uuid": "b81c002f-df59-4ba0-b7a3-4294c7853154", 00:11:57.117 "is_configured": true, 00:11:57.117 "data_offset": 0, 00:11:57.117 "data_size": 65536 00:11:57.117 } 00:11:57.117 ] 00:11:57.117 }' 00:11:57.117 23:33:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.117 23:33:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:57.374 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:57.632 [2024-07-24 23:33:42.487439] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:57.632 "name": "Existed_Raid", 00:11:57.632 "aliases": [ 00:11:57.632 "2da9f802-3048-4473-8f28-9fb99738f795" 00:11:57.632 ], 00:11:57.632 "product_name": "Raid Volume", 
00:11:57.632 "block_size": 512, 00:11:57.632 "num_blocks": 196608, 00:11:57.632 "uuid": "2da9f802-3048-4473-8f28-9fb99738f795", 00:11:57.632 "assigned_rate_limits": { 00:11:57.632 "rw_ios_per_sec": 0, 00:11:57.632 "rw_mbytes_per_sec": 0, 00:11:57.632 "r_mbytes_per_sec": 0, 00:11:57.632 "w_mbytes_per_sec": 0 00:11:57.632 }, 00:11:57.632 "claimed": false, 00:11:57.632 "zoned": false, 00:11:57.632 "supported_io_types": { 00:11:57.632 "read": true, 00:11:57.632 "write": true, 00:11:57.632 "unmap": true, 00:11:57.632 "flush": true, 00:11:57.632 "reset": true, 00:11:57.632 "nvme_admin": false, 00:11:57.632 "nvme_io": false, 00:11:57.632 "nvme_io_md": false, 00:11:57.632 "write_zeroes": true, 00:11:57.632 "zcopy": false, 00:11:57.632 "get_zone_info": false, 00:11:57.632 "zone_management": false, 00:11:57.632 "zone_append": false, 00:11:57.632 "compare": false, 00:11:57.632 "compare_and_write": false, 00:11:57.632 "abort": false, 00:11:57.632 "seek_hole": false, 00:11:57.632 "seek_data": false, 00:11:57.632 "copy": false, 00:11:57.632 "nvme_iov_md": false 00:11:57.632 }, 00:11:57.632 "memory_domains": [ 00:11:57.632 { 00:11:57.632 "dma_device_id": "system", 00:11:57.632 "dma_device_type": 1 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.632 "dma_device_type": 2 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "dma_device_id": "system", 00:11:57.632 "dma_device_type": 1 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.632 "dma_device_type": 2 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "dma_device_id": "system", 00:11:57.632 "dma_device_type": 1 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.632 "dma_device_type": 2 00:11:57.632 } 00:11:57.632 ], 00:11:57.632 "driver_specific": { 00:11:57.632 "raid": { 00:11:57.632 "uuid": "2da9f802-3048-4473-8f28-9fb99738f795", 00:11:57.632 "strip_size_kb": 64, 00:11:57.632 "state": "online", 00:11:57.632 
"raid_level": "concat", 00:11:57.632 "superblock": false, 00:11:57.632 "num_base_bdevs": 3, 00:11:57.632 "num_base_bdevs_discovered": 3, 00:11:57.632 "num_base_bdevs_operational": 3, 00:11:57.632 "base_bdevs_list": [ 00:11:57.632 { 00:11:57.632 "name": "BaseBdev1", 00:11:57.632 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:57.632 "is_configured": true, 00:11:57.632 "data_offset": 0, 00:11:57.632 "data_size": 65536 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "name": "BaseBdev2", 00:11:57.632 "uuid": "15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:57.632 "is_configured": true, 00:11:57.632 "data_offset": 0, 00:11:57.632 "data_size": 65536 00:11:57.632 }, 00:11:57.632 { 00:11:57.632 "name": "BaseBdev3", 00:11:57.632 "uuid": "b81c002f-df59-4ba0-b7a3-4294c7853154", 00:11:57.632 "is_configured": true, 00:11:57.632 "data_offset": 0, 00:11:57.632 "data_size": 65536 00:11:57.632 } 00:11:57.632 ] 00:11:57.632 } 00:11:57.632 } 00:11:57.632 }' 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:57.632 BaseBdev2 00:11:57.632 BaseBdev3' 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:57.632 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:57.890 "name": "BaseBdev1", 00:11:57.890 "aliases": [ 00:11:57.890 "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d" 00:11:57.890 ], 00:11:57.890 "product_name": "Malloc disk", 00:11:57.890 
"block_size": 512, 00:11:57.890 "num_blocks": 65536, 00:11:57.890 "uuid": "58a142b9-bbe1-4e29-acb1-4fb12d71bb9d", 00:11:57.890 "assigned_rate_limits": { 00:11:57.890 "rw_ios_per_sec": 0, 00:11:57.890 "rw_mbytes_per_sec": 0, 00:11:57.890 "r_mbytes_per_sec": 0, 00:11:57.890 "w_mbytes_per_sec": 0 00:11:57.890 }, 00:11:57.890 "claimed": true, 00:11:57.890 "claim_type": "exclusive_write", 00:11:57.890 "zoned": false, 00:11:57.890 "supported_io_types": { 00:11:57.890 "read": true, 00:11:57.890 "write": true, 00:11:57.890 "unmap": true, 00:11:57.890 "flush": true, 00:11:57.890 "reset": true, 00:11:57.890 "nvme_admin": false, 00:11:57.890 "nvme_io": false, 00:11:57.890 "nvme_io_md": false, 00:11:57.890 "write_zeroes": true, 00:11:57.890 "zcopy": true, 00:11:57.890 "get_zone_info": false, 00:11:57.890 "zone_management": false, 00:11:57.890 "zone_append": false, 00:11:57.890 "compare": false, 00:11:57.890 "compare_and_write": false, 00:11:57.890 "abort": true, 00:11:57.890 "seek_hole": false, 00:11:57.890 "seek_data": false, 00:11:57.890 "copy": true, 00:11:57.890 "nvme_iov_md": false 00:11:57.890 }, 00:11:57.890 "memory_domains": [ 00:11:57.890 { 00:11:57.890 "dma_device_id": "system", 00:11:57.890 "dma_device_type": 1 00:11:57.890 }, 00:11:57.890 { 00:11:57.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.890 "dma_device_type": 2 00:11:57.890 } 00:11:57.890 ], 00:11:57.890 "driver_specific": {} 00:11:57.890 }' 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:57.890 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.147 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.147 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.147 23:33:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.147 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.147 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.147 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.147 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:58.147 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.405 "name": "BaseBdev2", 00:11:58.405 "aliases": [ 00:11:58.405 "15949d72-3be0-44ef-9da6-dc7ee9fe06dd" 00:11:58.405 ], 00:11:58.405 "product_name": "Malloc disk", 00:11:58.405 "block_size": 512, 00:11:58.405 "num_blocks": 65536, 00:11:58.405 "uuid": "15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:58.405 "assigned_rate_limits": { 00:11:58.405 "rw_ios_per_sec": 0, 00:11:58.405 "rw_mbytes_per_sec": 0, 00:11:58.405 "r_mbytes_per_sec": 0, 00:11:58.405 "w_mbytes_per_sec": 0 00:11:58.405 }, 00:11:58.405 "claimed": true, 00:11:58.405 "claim_type": "exclusive_write", 00:11:58.405 "zoned": false, 00:11:58.405 "supported_io_types": { 00:11:58.405 "read": true, 00:11:58.405 "write": true, 00:11:58.405 "unmap": true, 00:11:58.405 "flush": true, 00:11:58.405 "reset": true, 00:11:58.405 "nvme_admin": 
false, 00:11:58.405 "nvme_io": false, 00:11:58.405 "nvme_io_md": false, 00:11:58.405 "write_zeroes": true, 00:11:58.405 "zcopy": true, 00:11:58.405 "get_zone_info": false, 00:11:58.405 "zone_management": false, 00:11:58.405 "zone_append": false, 00:11:58.405 "compare": false, 00:11:58.405 "compare_and_write": false, 00:11:58.405 "abort": true, 00:11:58.405 "seek_hole": false, 00:11:58.405 "seek_data": false, 00:11:58.405 "copy": true, 00:11:58.405 "nvme_iov_md": false 00:11:58.405 }, 00:11:58.405 "memory_domains": [ 00:11:58.405 { 00:11:58.405 "dma_device_id": "system", 00:11:58.405 "dma_device_type": 1 00:11:58.405 }, 00:11:58.405 { 00:11:58.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.405 "dma_device_type": 2 00:11:58.405 } 00:11:58.405 ], 00:11:58.405 "driver_specific": {} 00:11:58.405 }' 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.405 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:58.662 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:58.920 "name": "BaseBdev3", 00:11:58.920 "aliases": [ 00:11:58.920 "b81c002f-df59-4ba0-b7a3-4294c7853154" 00:11:58.920 ], 00:11:58.920 "product_name": "Malloc disk", 00:11:58.920 "block_size": 512, 00:11:58.920 "num_blocks": 65536, 00:11:58.920 "uuid": "b81c002f-df59-4ba0-b7a3-4294c7853154", 00:11:58.920 "assigned_rate_limits": { 00:11:58.920 "rw_ios_per_sec": 0, 00:11:58.920 "rw_mbytes_per_sec": 0, 00:11:58.920 "r_mbytes_per_sec": 0, 00:11:58.920 "w_mbytes_per_sec": 0 00:11:58.920 }, 00:11:58.920 "claimed": true, 00:11:58.920 "claim_type": "exclusive_write", 00:11:58.920 "zoned": false, 00:11:58.920 "supported_io_types": { 00:11:58.920 "read": true, 00:11:58.920 "write": true, 00:11:58.920 "unmap": true, 00:11:58.920 "flush": true, 00:11:58.920 "reset": true, 00:11:58.920 "nvme_admin": false, 00:11:58.920 "nvme_io": false, 00:11:58.920 "nvme_io_md": false, 00:11:58.920 "write_zeroes": true, 00:11:58.920 "zcopy": true, 00:11:58.920 "get_zone_info": false, 00:11:58.920 "zone_management": false, 00:11:58.920 "zone_append": false, 00:11:58.920 "compare": false, 00:11:58.920 "compare_and_write": false, 00:11:58.920 "abort": true, 00:11:58.920 "seek_hole": false, 00:11:58.920 "seek_data": false, 00:11:58.920 "copy": true, 00:11:58.920 "nvme_iov_md": false 00:11:58.920 }, 00:11:58.920 "memory_domains": [ 00:11:58.920 { 00:11:58.920 "dma_device_id": "system", 00:11:58.920 "dma_device_type": 1 00:11:58.920 
}, 00:11:58.920 { 00:11:58.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.920 "dma_device_type": 2 00:11:58.920 } 00:11:58.920 ], 00:11:58.920 "driver_specific": {} 00:11:58.920 }' 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:58.920 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.177 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.177 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.177 23:33:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.177 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.177 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:59.177 [2024-07-24 23:33:44.167640] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:59.177 [2024-07-24 23:33:44.167657] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:59.177 [2024-07-24 23:33:44.167684] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:59.434 
23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:59.434 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.435 "name": "Existed_Raid", 00:11:59.435 "uuid": "2da9f802-3048-4473-8f28-9fb99738f795", 00:11:59.435 "strip_size_kb": 64, 00:11:59.435 "state": "offline", 00:11:59.435 "raid_level": "concat", 00:11:59.435 "superblock": false, 00:11:59.435 "num_base_bdevs": 3, 00:11:59.435 "num_base_bdevs_discovered": 2, 00:11:59.435 "num_base_bdevs_operational": 2, 00:11:59.435 "base_bdevs_list": [ 00:11:59.435 { 00:11:59.435 "name": null, 00:11:59.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.435 "is_configured": false, 00:11:59.435 "data_offset": 0, 00:11:59.435 "data_size": 65536 00:11:59.435 }, 00:11:59.435 { 00:11:59.435 "name": "BaseBdev2", 00:11:59.435 "uuid": "15949d72-3be0-44ef-9da6-dc7ee9fe06dd", 00:11:59.435 "is_configured": true, 00:11:59.435 "data_offset": 0, 00:11:59.435 "data_size": 65536 00:11:59.435 }, 00:11:59.435 { 00:11:59.435 "name": "BaseBdev3", 00:11:59.435 "uuid": "b81c002f-df59-4ba0-b7a3-4294c7853154", 00:11:59.435 "is_configured": true, 00:11:59.435 "data_offset": 0, 00:11:59.435 "data_size": 65536 00:11:59.435 } 00:11:59.435 ] 00:11:59.435 }' 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.435 23:33:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.999 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:59.999 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:59.999 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.999 23:33:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:00.258 [2024-07-24 23:33:45.175079] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.258 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:00.516 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:00.516 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:00.516 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:00.775 [2024-07-24 23:33:45.521813] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:00.775 [2024-07-24 23:33:45.521843] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc22a0 name Existed_Raid, state offline 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:00.775 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:01.033 BaseBdev2 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:01.033 23:33:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.291 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:01.291 [ 00:12:01.291 { 00:12:01.291 "name": "BaseBdev2", 00:12:01.291 "aliases": [ 00:12:01.291 "514d0597-0abf-41fc-bafe-6e47604899bc" 00:12:01.291 ], 00:12:01.291 "product_name": "Malloc disk", 00:12:01.291 "block_size": 512, 00:12:01.291 "num_blocks": 65536, 00:12:01.291 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:01.291 "assigned_rate_limits": { 00:12:01.291 "rw_ios_per_sec": 0, 00:12:01.291 "rw_mbytes_per_sec": 0, 00:12:01.291 "r_mbytes_per_sec": 0, 00:12:01.291 "w_mbytes_per_sec": 0 00:12:01.291 }, 00:12:01.291 "claimed": false, 00:12:01.291 "zoned": false, 00:12:01.291 "supported_io_types": { 00:12:01.291 "read": true, 00:12:01.291 "write": true, 00:12:01.291 "unmap": true, 00:12:01.291 "flush": true, 00:12:01.291 "reset": true, 00:12:01.291 "nvme_admin": false, 00:12:01.291 "nvme_io": false, 00:12:01.291 "nvme_io_md": false, 00:12:01.291 "write_zeroes": true, 00:12:01.291 "zcopy": true, 00:12:01.291 "get_zone_info": false, 00:12:01.291 "zone_management": false, 00:12:01.291 "zone_append": false, 00:12:01.291 "compare": false, 00:12:01.291 "compare_and_write": false, 00:12:01.291 "abort": true, 00:12:01.291 "seek_hole": false, 00:12:01.291 "seek_data": false, 00:12:01.291 "copy": true, 00:12:01.291 "nvme_iov_md": false 00:12:01.291 }, 00:12:01.291 "memory_domains": [ 00:12:01.291 { 00:12:01.291 "dma_device_id": "system", 00:12:01.291 "dma_device_type": 1 00:12:01.291 }, 00:12:01.291 { 00:12:01.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.291 "dma_device_type": 2 00:12:01.291 } 00:12:01.291 ], 00:12:01.291 "driver_specific": {} 00:12:01.291 } 00:12:01.291 ] 00:12:01.291 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:01.291 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:01.291 23:33:46 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:01.291 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:01.589 BaseBdev3 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.589 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:01.870 [ 00:12:01.870 { 00:12:01.870 "name": "BaseBdev3", 00:12:01.870 "aliases": [ 00:12:01.870 "3a58c208-0f87-4f35-a375-9a1e956dfc90" 00:12:01.870 ], 00:12:01.870 "product_name": "Malloc disk", 00:12:01.870 "block_size": 512, 00:12:01.870 "num_blocks": 65536, 00:12:01.870 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:01.870 "assigned_rate_limits": { 00:12:01.870 "rw_ios_per_sec": 0, 00:12:01.870 "rw_mbytes_per_sec": 0, 00:12:01.870 "r_mbytes_per_sec": 0, 00:12:01.870 "w_mbytes_per_sec": 0 00:12:01.870 }, 00:12:01.870 "claimed": false, 00:12:01.870 "zoned": false, 00:12:01.870 
"supported_io_types": { 00:12:01.870 "read": true, 00:12:01.870 "write": true, 00:12:01.870 "unmap": true, 00:12:01.870 "flush": true, 00:12:01.870 "reset": true, 00:12:01.870 "nvme_admin": false, 00:12:01.870 "nvme_io": false, 00:12:01.870 "nvme_io_md": false, 00:12:01.870 "write_zeroes": true, 00:12:01.870 "zcopy": true, 00:12:01.870 "get_zone_info": false, 00:12:01.870 "zone_management": false, 00:12:01.870 "zone_append": false, 00:12:01.870 "compare": false, 00:12:01.870 "compare_and_write": false, 00:12:01.870 "abort": true, 00:12:01.870 "seek_hole": false, 00:12:01.870 "seek_data": false, 00:12:01.870 "copy": true, 00:12:01.870 "nvme_iov_md": false 00:12:01.870 }, 00:12:01.870 "memory_domains": [ 00:12:01.870 { 00:12:01.870 "dma_device_id": "system", 00:12:01.870 "dma_device_type": 1 00:12:01.870 }, 00:12:01.870 { 00:12:01.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.870 "dma_device_type": 2 00:12:01.870 } 00:12:01.870 ], 00:12:01.870 "driver_specific": {} 00:12:01.870 } 00:12:01.870 ] 00:12:01.870 23:33:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:01.870 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:01.870 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:01.870 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:01.870 [2024-07-24 23:33:46.858607] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:01.870 [2024-07-24 23:33:46.858636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:01.870 [2024-07-24 23:33:46.858649] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:01.870 
[2024-07-24 23:33:46.859592] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.129 23:33:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.129 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.129 "name": "Existed_Raid", 00:12:02.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.129 "strip_size_kb": 64, 00:12:02.129 "state": "configuring", 00:12:02.129 "raid_level": "concat", 00:12:02.129 "superblock": false, 00:12:02.129 "num_base_bdevs": 3, 00:12:02.129 
"num_base_bdevs_discovered": 2, 00:12:02.129 "num_base_bdevs_operational": 3, 00:12:02.129 "base_bdevs_list": [ 00:12:02.129 { 00:12:02.129 "name": "BaseBdev1", 00:12:02.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.129 "is_configured": false, 00:12:02.129 "data_offset": 0, 00:12:02.129 "data_size": 0 00:12:02.129 }, 00:12:02.129 { 00:12:02.129 "name": "BaseBdev2", 00:12:02.129 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:02.129 "is_configured": true, 00:12:02.129 "data_offset": 0, 00:12:02.129 "data_size": 65536 00:12:02.129 }, 00:12:02.129 { 00:12:02.129 "name": "BaseBdev3", 00:12:02.129 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:02.129 "is_configured": true, 00:12:02.129 "data_offset": 0, 00:12:02.129 "data_size": 65536 00:12:02.129 } 00:12:02.129 ] 00:12:02.129 }' 00:12:02.129 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.129 23:33:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.696 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:02.954 [2024-07-24 23:33:47.700775] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.954 23:33:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.954 "name": "Existed_Raid", 00:12:02.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.954 "strip_size_kb": 64, 00:12:02.954 "state": "configuring", 00:12:02.954 "raid_level": "concat", 00:12:02.954 "superblock": false, 00:12:02.954 "num_base_bdevs": 3, 00:12:02.954 "num_base_bdevs_discovered": 1, 00:12:02.954 "num_base_bdevs_operational": 3, 00:12:02.954 "base_bdevs_list": [ 00:12:02.954 { 00:12:02.954 "name": "BaseBdev1", 00:12:02.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.954 "is_configured": false, 00:12:02.954 "data_offset": 0, 00:12:02.954 "data_size": 0 00:12:02.954 }, 00:12:02.954 { 00:12:02.954 "name": null, 00:12:02.954 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:02.954 "is_configured": false, 00:12:02.954 "data_offset": 0, 00:12:02.954 "data_size": 65536 00:12:02.954 }, 00:12:02.954 { 00:12:02.954 "name": "BaseBdev3", 00:12:02.954 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:02.954 "is_configured": true, 00:12:02.954 "data_offset": 0, 
00:12:02.954 "data_size": 65536 00:12:02.954 } 00:12:02.954 ] 00:12:02.954 }' 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.954 23:33:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.520 23:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:03.520 23:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:03.779 [2024-07-24 23:33:48.714178] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:03.779 BaseBdev1 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:03.779 23:33:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:04.037 23:33:48 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:04.038 [ 00:12:04.038 { 00:12:04.038 "name": "BaseBdev1", 00:12:04.038 "aliases": [ 00:12:04.038 "4944bb16-d911-48fd-9558-380a8f0f1c91" 00:12:04.038 ], 00:12:04.038 "product_name": "Malloc disk", 00:12:04.038 "block_size": 512, 00:12:04.038 "num_blocks": 65536, 00:12:04.038 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:04.038 "assigned_rate_limits": { 00:12:04.038 "rw_ios_per_sec": 0, 00:12:04.038 "rw_mbytes_per_sec": 0, 00:12:04.038 "r_mbytes_per_sec": 0, 00:12:04.038 "w_mbytes_per_sec": 0 00:12:04.038 }, 00:12:04.038 "claimed": true, 00:12:04.038 "claim_type": "exclusive_write", 00:12:04.038 "zoned": false, 00:12:04.038 "supported_io_types": { 00:12:04.038 "read": true, 00:12:04.038 "write": true, 00:12:04.038 "unmap": true, 00:12:04.038 "flush": true, 00:12:04.038 "reset": true, 00:12:04.038 "nvme_admin": false, 00:12:04.038 "nvme_io": false, 00:12:04.038 "nvme_io_md": false, 00:12:04.038 "write_zeroes": true, 00:12:04.038 "zcopy": true, 00:12:04.038 "get_zone_info": false, 00:12:04.038 "zone_management": false, 00:12:04.038 "zone_append": false, 00:12:04.038 "compare": false, 00:12:04.038 "compare_and_write": false, 00:12:04.038 "abort": true, 00:12:04.038 "seek_hole": false, 00:12:04.038 "seek_data": false, 00:12:04.038 "copy": true, 00:12:04.038 "nvme_iov_md": false 00:12:04.038 }, 00:12:04.038 "memory_domains": [ 00:12:04.038 { 00:12:04.038 "dma_device_id": "system", 00:12:04.038 "dma_device_type": 1 00:12:04.038 }, 00:12:04.038 { 00:12:04.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.038 "dma_device_type": 2 00:12:04.038 } 00:12:04.038 ], 00:12:04.038 "driver_specific": {} 00:12:04.038 } 00:12:04.038 ] 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:04.038 23:33:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.038 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.296 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.296 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.296 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.296 "name": "Existed_Raid", 00:12:04.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.296 "strip_size_kb": 64, 00:12:04.296 "state": "configuring", 00:12:04.296 "raid_level": "concat", 00:12:04.296 "superblock": false, 00:12:04.296 "num_base_bdevs": 3, 00:12:04.296 "num_base_bdevs_discovered": 2, 00:12:04.296 "num_base_bdevs_operational": 3, 00:12:04.296 "base_bdevs_list": [ 00:12:04.296 { 
00:12:04.296 "name": "BaseBdev1", 00:12:04.296 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:04.296 "is_configured": true, 00:12:04.296 "data_offset": 0, 00:12:04.296 "data_size": 65536 00:12:04.296 }, 00:12:04.296 { 00:12:04.296 "name": null, 00:12:04.296 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:04.296 "is_configured": false, 00:12:04.296 "data_offset": 0, 00:12:04.296 "data_size": 65536 00:12:04.296 }, 00:12:04.296 { 00:12:04.296 "name": "BaseBdev3", 00:12:04.296 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:04.296 "is_configured": true, 00:12:04.296 "data_offset": 0, 00:12:04.296 "data_size": 65536 00:12:04.296 } 00:12:04.296 ] 00:12:04.296 }' 00:12:04.296 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.296 23:33:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.863 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:04.863 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.863 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:04.863 23:33:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:05.126 [2024-07-24 23:33:50.005542] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.126 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.391 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.391 "name": "Existed_Raid", 00:12:05.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.391 "strip_size_kb": 64, 00:12:05.391 "state": "configuring", 00:12:05.391 "raid_level": "concat", 00:12:05.391 "superblock": false, 00:12:05.391 "num_base_bdevs": 3, 00:12:05.391 "num_base_bdevs_discovered": 1, 00:12:05.391 "num_base_bdevs_operational": 3, 00:12:05.391 "base_bdevs_list": [ 00:12:05.391 { 00:12:05.391 "name": "BaseBdev1", 00:12:05.391 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:05.391 "is_configured": true, 00:12:05.391 "data_offset": 0, 00:12:05.391 "data_size": 65536 00:12:05.391 }, 00:12:05.391 { 00:12:05.391 "name": null, 00:12:05.391 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:05.391 
"is_configured": false, 00:12:05.391 "data_offset": 0, 00:12:05.391 "data_size": 65536 00:12:05.391 }, 00:12:05.391 { 00:12:05.391 "name": null, 00:12:05.391 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:05.391 "is_configured": false, 00:12:05.391 "data_offset": 0, 00:12:05.391 "data_size": 65536 00:12:05.391 } 00:12:05.391 ] 00:12:05.391 }' 00:12:05.391 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.391 23:33:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.956 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.956 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:05.956 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:05.956 23:33:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:06.213 [2024-07-24 23:33:51.036220] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.213 23:33:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.213 "name": "Existed_Raid", 00:12:06.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.213 "strip_size_kb": 64, 00:12:06.213 "state": "configuring", 00:12:06.213 "raid_level": "concat", 00:12:06.213 "superblock": false, 00:12:06.213 "num_base_bdevs": 3, 00:12:06.213 "num_base_bdevs_discovered": 2, 00:12:06.213 "num_base_bdevs_operational": 3, 00:12:06.213 "base_bdevs_list": [ 00:12:06.213 { 00:12:06.213 "name": "BaseBdev1", 00:12:06.213 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:06.213 "is_configured": true, 00:12:06.213 "data_offset": 0, 00:12:06.213 "data_size": 65536 00:12:06.213 }, 00:12:06.213 { 00:12:06.213 "name": null, 00:12:06.213 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:06.213 "is_configured": false, 00:12:06.213 "data_offset": 0, 00:12:06.213 "data_size": 65536 00:12:06.213 }, 00:12:06.213 { 00:12:06.213 "name": "BaseBdev3", 00:12:06.213 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:06.213 "is_configured": true, 00:12:06.213 "data_offset": 0, 
00:12:06.213 "data_size": 65536 00:12:06.213 } 00:12:06.213 ] 00:12:06.213 }' 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.213 23:33:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.777 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.777 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:07.035 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:07.035 23:33:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:07.035 [2024-07-24 23:33:52.014757] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.293 
23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.293 "name": "Existed_Raid", 00:12:07.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.293 "strip_size_kb": 64, 00:12:07.293 "state": "configuring", 00:12:07.293 "raid_level": "concat", 00:12:07.293 "superblock": false, 00:12:07.293 "num_base_bdevs": 3, 00:12:07.293 "num_base_bdevs_discovered": 1, 00:12:07.293 "num_base_bdevs_operational": 3, 00:12:07.293 "base_bdevs_list": [ 00:12:07.293 { 00:12:07.293 "name": null, 00:12:07.293 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:07.293 "is_configured": false, 00:12:07.293 "data_offset": 0, 00:12:07.293 "data_size": 65536 00:12:07.293 }, 00:12:07.293 { 00:12:07.293 "name": null, 00:12:07.293 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:07.293 "is_configured": false, 00:12:07.293 "data_offset": 0, 00:12:07.293 "data_size": 65536 00:12:07.293 }, 00:12:07.293 { 00:12:07.293 "name": "BaseBdev3", 00:12:07.293 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:07.293 "is_configured": true, 00:12:07.293 "data_offset": 0, 00:12:07.293 "data_size": 65536 00:12:07.293 } 00:12:07.293 ] 00:12:07.293 }' 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.293 23:33:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.858 23:33:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.858 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:07.858 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:07.858 23:33:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:08.116 [2024-07-24 23:33:52.998927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.116 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.374 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:08.374 "name": "Existed_Raid", 00:12:08.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:08.374 "strip_size_kb": 64, 00:12:08.374 "state": "configuring", 00:12:08.374 "raid_level": "concat", 00:12:08.374 "superblock": false, 00:12:08.374 "num_base_bdevs": 3, 00:12:08.374 "num_base_bdevs_discovered": 2, 00:12:08.374 "num_base_bdevs_operational": 3, 00:12:08.374 "base_bdevs_list": [ 00:12:08.374 { 00:12:08.374 "name": null, 00:12:08.374 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:08.374 "is_configured": false, 00:12:08.374 "data_offset": 0, 00:12:08.374 "data_size": 65536 00:12:08.374 }, 00:12:08.374 { 00:12:08.374 "name": "BaseBdev2", 00:12:08.374 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:08.374 "is_configured": true, 00:12:08.374 "data_offset": 0, 00:12:08.374 "data_size": 65536 00:12:08.374 }, 00:12:08.374 { 00:12:08.374 "name": "BaseBdev3", 00:12:08.374 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:08.374 "is_configured": true, 00:12:08.374 "data_offset": 0, 00:12:08.374 "data_size": 65536 00:12:08.374 } 00:12:08.374 ] 00:12:08.374 }' 00:12:08.374 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:08.374 23:33:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:08.937 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.937 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:08.937 
23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:08.937 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.937 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:09.194 23:33:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4944bb16-d911-48fd-9558-380a8f0f1c91 00:12:09.194 [2024-07-24 23:33:54.116529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:09.194 [2024-07-24 23:33:54.116560] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe73ac0 00:12:09.194 [2024-07-24 23:33:54.116564] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:09.194 [2024-07-24 23:33:54.116690] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcb97f0 00:12:09.194 [2024-07-24 23:33:54.116767] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe73ac0 00:12:09.194 [2024-07-24 23:33:54.116771] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe73ac0 00:12:09.194 [2024-07-24 23:33:54.116882] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.194 NewBaseBdev 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:09.194 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:09.451 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:09.451 [ 00:12:09.451 { 00:12:09.451 "name": "NewBaseBdev", 00:12:09.451 "aliases": [ 00:12:09.451 "4944bb16-d911-48fd-9558-380a8f0f1c91" 00:12:09.451 ], 00:12:09.451 "product_name": "Malloc disk", 00:12:09.451 "block_size": 512, 00:12:09.451 "num_blocks": 65536, 00:12:09.451 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:09.451 "assigned_rate_limits": { 00:12:09.451 "rw_ios_per_sec": 0, 00:12:09.451 "rw_mbytes_per_sec": 0, 00:12:09.451 "r_mbytes_per_sec": 0, 00:12:09.451 "w_mbytes_per_sec": 0 00:12:09.451 }, 00:12:09.451 "claimed": true, 00:12:09.451 "claim_type": "exclusive_write", 00:12:09.451 "zoned": false, 00:12:09.451 "supported_io_types": { 00:12:09.451 "read": true, 00:12:09.451 "write": true, 00:12:09.451 "unmap": true, 00:12:09.451 "flush": true, 00:12:09.451 "reset": true, 00:12:09.451 "nvme_admin": false, 00:12:09.451 "nvme_io": false, 00:12:09.451 "nvme_io_md": false, 00:12:09.451 "write_zeroes": true, 00:12:09.451 "zcopy": true, 00:12:09.451 "get_zone_info": false, 00:12:09.451 "zone_management": false, 00:12:09.451 "zone_append": false, 00:12:09.451 "compare": false, 00:12:09.451 "compare_and_write": false, 00:12:09.451 "abort": true, 00:12:09.451 "seek_hole": false, 00:12:09.451 "seek_data": false, 00:12:09.451 "copy": true, 00:12:09.451 "nvme_iov_md": 
false 00:12:09.451 }, 00:12:09.451 "memory_domains": [ 00:12:09.451 { 00:12:09.451 "dma_device_id": "system", 00:12:09.451 "dma_device_type": 1 00:12:09.451 }, 00:12:09.451 { 00:12:09.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:09.451 "dma_device_type": 2 00:12:09.451 } 00:12:09.451 ], 00:12:09.452 "driver_specific": {} 00:12:09.452 } 00:12:09.452 ] 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.452 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.710 23:33:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.710 "name": "Existed_Raid", 00:12:09.710 "uuid": "26d6beec-878b-4f8a-993d-750db199dbb2", 00:12:09.710 "strip_size_kb": 64, 00:12:09.710 "state": "online", 00:12:09.710 "raid_level": "concat", 00:12:09.710 "superblock": false, 00:12:09.710 "num_base_bdevs": 3, 00:12:09.710 "num_base_bdevs_discovered": 3, 00:12:09.710 "num_base_bdevs_operational": 3, 00:12:09.710 "base_bdevs_list": [ 00:12:09.710 { 00:12:09.710 "name": "NewBaseBdev", 00:12:09.710 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:09.710 "is_configured": true, 00:12:09.710 "data_offset": 0, 00:12:09.710 "data_size": 65536 00:12:09.710 }, 00:12:09.710 { 00:12:09.710 "name": "BaseBdev2", 00:12:09.710 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:09.710 "is_configured": true, 00:12:09.710 "data_offset": 0, 00:12:09.710 "data_size": 65536 00:12:09.710 }, 00:12:09.710 { 00:12:09.710 "name": "BaseBdev3", 00:12:09.710 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:09.710 "is_configured": true, 00:12:09.710 "data_offset": 0, 00:12:09.710 "data_size": 65536 00:12:09.710 } 00:12:09.710 ] 00:12:09.710 }' 00:12:09.710 23:33:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.710 23:33:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:10.275 23:33:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:10.275 [2024-07-24 23:33:55.235605] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.275 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:10.275 "name": "Existed_Raid", 00:12:10.275 "aliases": [ 00:12:10.275 "26d6beec-878b-4f8a-993d-750db199dbb2" 00:12:10.275 ], 00:12:10.275 "product_name": "Raid Volume", 00:12:10.275 "block_size": 512, 00:12:10.275 "num_blocks": 196608, 00:12:10.275 "uuid": "26d6beec-878b-4f8a-993d-750db199dbb2", 00:12:10.275 "assigned_rate_limits": { 00:12:10.275 "rw_ios_per_sec": 0, 00:12:10.275 "rw_mbytes_per_sec": 0, 00:12:10.275 "r_mbytes_per_sec": 0, 00:12:10.275 "w_mbytes_per_sec": 0 00:12:10.275 }, 00:12:10.275 "claimed": false, 00:12:10.275 "zoned": false, 00:12:10.275 "supported_io_types": { 00:12:10.275 "read": true, 00:12:10.275 "write": true, 00:12:10.275 "unmap": true, 00:12:10.275 "flush": true, 00:12:10.275 "reset": true, 00:12:10.275 "nvme_admin": false, 00:12:10.275 "nvme_io": false, 00:12:10.275 "nvme_io_md": false, 00:12:10.275 "write_zeroes": true, 00:12:10.275 "zcopy": false, 00:12:10.275 "get_zone_info": false, 00:12:10.275 "zone_management": false, 00:12:10.275 "zone_append": false, 00:12:10.275 "compare": false, 00:12:10.275 "compare_and_write": false, 00:12:10.275 "abort": false, 00:12:10.275 "seek_hole": false, 00:12:10.275 "seek_data": false, 00:12:10.275 "copy": false, 00:12:10.275 "nvme_iov_md": false 00:12:10.275 }, 00:12:10.275 "memory_domains": [ 00:12:10.275 { 00:12:10.275 "dma_device_id": "system", 00:12:10.275 "dma_device_type": 1 00:12:10.275 }, 
00:12:10.275 { 00:12:10.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.275 "dma_device_type": 2 00:12:10.275 }, 00:12:10.275 { 00:12:10.275 "dma_device_id": "system", 00:12:10.275 "dma_device_type": 1 00:12:10.275 }, 00:12:10.275 { 00:12:10.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.275 "dma_device_type": 2 00:12:10.275 }, 00:12:10.275 { 00:12:10.275 "dma_device_id": "system", 00:12:10.275 "dma_device_type": 1 00:12:10.275 }, 00:12:10.275 { 00:12:10.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.275 "dma_device_type": 2 00:12:10.275 } 00:12:10.275 ], 00:12:10.275 "driver_specific": { 00:12:10.275 "raid": { 00:12:10.275 "uuid": "26d6beec-878b-4f8a-993d-750db199dbb2", 00:12:10.275 "strip_size_kb": 64, 00:12:10.275 "state": "online", 00:12:10.275 "raid_level": "concat", 00:12:10.276 "superblock": false, 00:12:10.276 "num_base_bdevs": 3, 00:12:10.276 "num_base_bdevs_discovered": 3, 00:12:10.276 "num_base_bdevs_operational": 3, 00:12:10.276 "base_bdevs_list": [ 00:12:10.276 { 00:12:10.276 "name": "NewBaseBdev", 00:12:10.276 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:10.276 "is_configured": true, 00:12:10.276 "data_offset": 0, 00:12:10.276 "data_size": 65536 00:12:10.276 }, 00:12:10.276 { 00:12:10.276 "name": "BaseBdev2", 00:12:10.276 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:10.276 "is_configured": true, 00:12:10.276 "data_offset": 0, 00:12:10.276 "data_size": 65536 00:12:10.276 }, 00:12:10.276 { 00:12:10.276 "name": "BaseBdev3", 00:12:10.276 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:10.276 "is_configured": true, 00:12:10.276 "data_offset": 0, 00:12:10.276 "data_size": 65536 00:12:10.276 } 00:12:10.276 ] 00:12:10.276 } 00:12:10.276 } 00:12:10.276 }' 00:12:10.276 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.533 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:12:10.533 BaseBdev2 00:12:10.533 BaseBdev3' 00:12:10.533 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.533 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:10.533 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.533 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.533 "name": "NewBaseBdev", 00:12:10.533 "aliases": [ 00:12:10.533 "4944bb16-d911-48fd-9558-380a8f0f1c91" 00:12:10.533 ], 00:12:10.533 "product_name": "Malloc disk", 00:12:10.533 "block_size": 512, 00:12:10.533 "num_blocks": 65536, 00:12:10.533 "uuid": "4944bb16-d911-48fd-9558-380a8f0f1c91", 00:12:10.533 "assigned_rate_limits": { 00:12:10.533 "rw_ios_per_sec": 0, 00:12:10.533 "rw_mbytes_per_sec": 0, 00:12:10.533 "r_mbytes_per_sec": 0, 00:12:10.533 "w_mbytes_per_sec": 0 00:12:10.533 }, 00:12:10.533 "claimed": true, 00:12:10.534 "claim_type": "exclusive_write", 00:12:10.534 "zoned": false, 00:12:10.534 "supported_io_types": { 00:12:10.534 "read": true, 00:12:10.534 "write": true, 00:12:10.534 "unmap": true, 00:12:10.534 "flush": true, 00:12:10.534 "reset": true, 00:12:10.534 "nvme_admin": false, 00:12:10.534 "nvme_io": false, 00:12:10.534 "nvme_io_md": false, 00:12:10.534 "write_zeroes": true, 00:12:10.534 "zcopy": true, 00:12:10.534 "get_zone_info": false, 00:12:10.534 "zone_management": false, 00:12:10.534 "zone_append": false, 00:12:10.534 "compare": false, 00:12:10.534 "compare_and_write": false, 00:12:10.534 "abort": true, 00:12:10.534 "seek_hole": false, 00:12:10.534 "seek_data": false, 00:12:10.534 "copy": true, 00:12:10.534 "nvme_iov_md": false 00:12:10.534 }, 00:12:10.534 "memory_domains": [ 00:12:10.534 { 00:12:10.534 "dma_device_id": "system", 00:12:10.534 
"dma_device_type": 1 00:12:10.534 }, 00:12:10.534 { 00:12:10.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.534 "dma_device_type": 2 00:12:10.534 } 00:12:10.534 ], 00:12:10.534 "driver_specific": {} 00:12:10.534 }' 00:12:10.534 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.534 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:10.791 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.792 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:10.792 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.049 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.049 "name": 
"BaseBdev2", 00:12:11.049 "aliases": [ 00:12:11.049 "514d0597-0abf-41fc-bafe-6e47604899bc" 00:12:11.049 ], 00:12:11.049 "product_name": "Malloc disk", 00:12:11.049 "block_size": 512, 00:12:11.049 "num_blocks": 65536, 00:12:11.049 "uuid": "514d0597-0abf-41fc-bafe-6e47604899bc", 00:12:11.049 "assigned_rate_limits": { 00:12:11.049 "rw_ios_per_sec": 0, 00:12:11.049 "rw_mbytes_per_sec": 0, 00:12:11.049 "r_mbytes_per_sec": 0, 00:12:11.049 "w_mbytes_per_sec": 0 00:12:11.049 }, 00:12:11.049 "claimed": true, 00:12:11.049 "claim_type": "exclusive_write", 00:12:11.049 "zoned": false, 00:12:11.049 "supported_io_types": { 00:12:11.049 "read": true, 00:12:11.049 "write": true, 00:12:11.049 "unmap": true, 00:12:11.049 "flush": true, 00:12:11.049 "reset": true, 00:12:11.049 "nvme_admin": false, 00:12:11.049 "nvme_io": false, 00:12:11.049 "nvme_io_md": false, 00:12:11.049 "write_zeroes": true, 00:12:11.049 "zcopy": true, 00:12:11.049 "get_zone_info": false, 00:12:11.049 "zone_management": false, 00:12:11.049 "zone_append": false, 00:12:11.049 "compare": false, 00:12:11.049 "compare_and_write": false, 00:12:11.049 "abort": true, 00:12:11.049 "seek_hole": false, 00:12:11.049 "seek_data": false, 00:12:11.049 "copy": true, 00:12:11.049 "nvme_iov_md": false 00:12:11.049 }, 00:12:11.049 "memory_domains": [ 00:12:11.049 { 00:12:11.049 "dma_device_id": "system", 00:12:11.049 "dma_device_type": 1 00:12:11.050 }, 00:12:11.050 { 00:12:11.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.050 "dma_device_type": 2 00:12:11.050 } 00:12:11.050 ], 00:12:11.050 "driver_specific": {} 00:12:11.050 }' 00:12:11.050 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.050 23:33:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.050 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.050 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:11.308 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:11.566 "name": "BaseBdev3", 00:12:11.566 "aliases": [ 00:12:11.566 "3a58c208-0f87-4f35-a375-9a1e956dfc90" 00:12:11.566 ], 00:12:11.566 "product_name": "Malloc disk", 00:12:11.566 "block_size": 512, 00:12:11.566 "num_blocks": 65536, 00:12:11.566 "uuid": "3a58c208-0f87-4f35-a375-9a1e956dfc90", 00:12:11.566 "assigned_rate_limits": { 00:12:11.566 "rw_ios_per_sec": 0, 00:12:11.566 "rw_mbytes_per_sec": 0, 00:12:11.566 "r_mbytes_per_sec": 0, 00:12:11.566 "w_mbytes_per_sec": 0 00:12:11.566 }, 00:12:11.566 "claimed": true, 00:12:11.566 "claim_type": "exclusive_write", 00:12:11.566 "zoned": false, 00:12:11.566 "supported_io_types": { 
00:12:11.566 "read": true, 00:12:11.566 "write": true, 00:12:11.566 "unmap": true, 00:12:11.566 "flush": true, 00:12:11.566 "reset": true, 00:12:11.566 "nvme_admin": false, 00:12:11.566 "nvme_io": false, 00:12:11.566 "nvme_io_md": false, 00:12:11.566 "write_zeroes": true, 00:12:11.566 "zcopy": true, 00:12:11.566 "get_zone_info": false, 00:12:11.566 "zone_management": false, 00:12:11.566 "zone_append": false, 00:12:11.566 "compare": false, 00:12:11.566 "compare_and_write": false, 00:12:11.566 "abort": true, 00:12:11.566 "seek_hole": false, 00:12:11.566 "seek_data": false, 00:12:11.566 "copy": true, 00:12:11.566 "nvme_iov_md": false 00:12:11.566 }, 00:12:11.566 "memory_domains": [ 00:12:11.566 { 00:12:11.566 "dma_device_id": "system", 00:12:11.566 "dma_device_type": 1 00:12:11.566 }, 00:12:11.566 { 00:12:11.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.566 "dma_device_type": 2 00:12:11.566 } 00:12:11.566 ], 00:12:11.566 "driver_specific": {} 00:12:11.566 }' 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.566 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.824 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.824 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.824 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:12:11.824 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.824 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.825 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:12.084 [2024-07-24 23:33:56.843578] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:12.084 [2024-07-24 23:33:56.843598] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:12.084 [2024-07-24 23:33:56.843633] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.084 [2024-07-24 23:33:56.843667] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.084 [2024-07-24 23:33:56.843673] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe73ac0 name Existed_Raid, state offline 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 276193 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 276193 ']' 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 276193 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 276193 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 276193' 00:12:12.084 killing process with pid 276193 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 276193 00:12:12.084 [2024-07-24 23:33:56.894818] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:12.084 23:33:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 276193 00:12:12.084 [2024-07-24 23:33:56.917589] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:12.342 00:12:12.342 real 0m21.280s 00:12:12.342 user 0m39.640s 00:12:12.342 sys 0m3.235s 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.342 ************************************ 00:12:12.342 END TEST raid_state_function_test 00:12:12.342 ************************************ 00:12:12.342 23:33:57 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:12:12.342 23:33:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:12.342 23:33:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:12.342 23:33:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:12.342 ************************************ 00:12:12.342 START TEST raid_state_function_test_sb 00:12:12.342 ************************************ 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 true 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:12.342 23:33:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:12.342 23:33:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:12.342 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=280436 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 280436' 00:12:12.343 Process raid pid: 280436 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 280436 /var/tmp/spdk-raid.sock 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 280436 ']' 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:12.343 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:12.343 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:12.343 [2024-07-24 23:33:57.208436] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:12:12.343 [2024-07-24 23:33:57.208483] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:12.343 [2024-07-24 23:33:57.271688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:12.601 [2024-07-24 23:33:57.350028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:12.601 [2024-07-24 23:33:57.400768] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:12.601 [2024-07-24 23:33:57.400794] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.167 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:13.167 23:33:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:12:13.167 23:33:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:13.167 [2024-07-24 23:33:58.147822] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:13.167 [2024-07-24 23:33:58.147852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:12:13.167 [2024-07-24 23:33:58.147858] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:13.167 [2024-07-24 23:33:58.147864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:13.167 [2024-07-24 23:33:58.147868] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:13.167 [2024-07-24 23:33:58.147874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.167 23:33:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:13.425 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.425 "name": "Existed_Raid", 00:12:13.425 "uuid": "629bc179-b090-41b3-b972-873e464dba17", 00:12:13.425 "strip_size_kb": 64, 00:12:13.425 "state": "configuring", 00:12:13.425 "raid_level": "concat", 00:12:13.425 "superblock": true, 00:12:13.425 "num_base_bdevs": 3, 00:12:13.425 "num_base_bdevs_discovered": 0, 00:12:13.425 "num_base_bdevs_operational": 3, 00:12:13.425 "base_bdevs_list": [ 00:12:13.425 { 00:12:13.425 "name": "BaseBdev1", 00:12:13.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.425 "is_configured": false, 00:12:13.425 "data_offset": 0, 00:12:13.425 "data_size": 0 00:12:13.425 }, 00:12:13.425 { 00:12:13.425 "name": "BaseBdev2", 00:12:13.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.425 "is_configured": false, 00:12:13.425 "data_offset": 0, 00:12:13.425 "data_size": 0 00:12:13.425 }, 00:12:13.425 { 00:12:13.425 "name": "BaseBdev3", 00:12:13.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.425 "is_configured": false, 00:12:13.425 "data_offset": 0, 00:12:13.425 "data_size": 0 00:12:13.425 } 00:12:13.425 ] 00:12:13.425 }' 00:12:13.425 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.425 23:33:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:13.991 23:33:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:13.991 [2024-07-24 23:33:58.973869] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:13.991 [2024-07-24 23:33:58.973890] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b0b30 name Existed_Raid, state configuring 00:12:14.249 23:33:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:14.249 [2024-07-24 23:33:59.142328] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.249 [2024-07-24 23:33:59.142345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.249 [2024-07-24 23:33:59.142350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.249 [2024-07-24 23:33:59.142355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:14.249 [2024-07-24 23:33:59.142359] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:14.249 [2024-07-24 23:33:59.142365] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:14.249 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:14.508 [2024-07-24 23:33:59.319011] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:14.508 BaseBdev1 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:14.508 23:33:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.508 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:14.766 [ 00:12:14.766 { 00:12:14.766 "name": "BaseBdev1", 00:12:14.766 "aliases": [ 00:12:14.766 "61193c0f-0f67-46c8-a56e-e565c731c7b2" 00:12:14.766 ], 00:12:14.766 "product_name": "Malloc disk", 00:12:14.766 "block_size": 512, 00:12:14.766 "num_blocks": 65536, 00:12:14.766 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:14.766 "assigned_rate_limits": { 00:12:14.766 "rw_ios_per_sec": 0, 00:12:14.766 "rw_mbytes_per_sec": 0, 00:12:14.766 "r_mbytes_per_sec": 0, 00:12:14.766 "w_mbytes_per_sec": 0 00:12:14.766 }, 00:12:14.766 "claimed": true, 00:12:14.766 "claim_type": "exclusive_write", 00:12:14.766 "zoned": false, 00:12:14.766 "supported_io_types": { 00:12:14.766 "read": true, 00:12:14.766 "write": true, 00:12:14.766 "unmap": true, 00:12:14.766 "flush": true, 00:12:14.766 "reset": true, 00:12:14.766 "nvme_admin": false, 00:12:14.766 "nvme_io": false, 00:12:14.766 "nvme_io_md": false, 00:12:14.766 "write_zeroes": true, 00:12:14.766 "zcopy": true, 00:12:14.766 "get_zone_info": false, 00:12:14.766 "zone_management": false, 00:12:14.766 "zone_append": false, 00:12:14.766 "compare": false, 00:12:14.766 "compare_and_write": false, 00:12:14.766 "abort": true, 00:12:14.766 "seek_hole": false, 00:12:14.766 "seek_data": false, 00:12:14.766 "copy": true, 00:12:14.766 "nvme_iov_md": false 00:12:14.766 }, 00:12:14.766 "memory_domains": [ 00:12:14.766 { 00:12:14.766 "dma_device_id": "system", 00:12:14.766 "dma_device_type": 1 00:12:14.766 }, 
00:12:14.766 { 00:12:14.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.767 "dma_device_type": 2 00:12:14.767 } 00:12:14.767 ], 00:12:14.767 "driver_specific": {} 00:12:14.767 } 00:12:14.767 ] 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.767 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.025 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.025 "name": "Existed_Raid", 00:12:15.025 
"uuid": "af8eff44-b8a4-475d-8fba-61156fa98436", 00:12:15.025 "strip_size_kb": 64, 00:12:15.025 "state": "configuring", 00:12:15.025 "raid_level": "concat", 00:12:15.025 "superblock": true, 00:12:15.025 "num_base_bdevs": 3, 00:12:15.025 "num_base_bdevs_discovered": 1, 00:12:15.025 "num_base_bdevs_operational": 3, 00:12:15.025 "base_bdevs_list": [ 00:12:15.025 { 00:12:15.025 "name": "BaseBdev1", 00:12:15.025 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:15.025 "is_configured": true, 00:12:15.025 "data_offset": 2048, 00:12:15.025 "data_size": 63488 00:12:15.025 }, 00:12:15.025 { 00:12:15.025 "name": "BaseBdev2", 00:12:15.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.025 "is_configured": false, 00:12:15.025 "data_offset": 0, 00:12:15.025 "data_size": 0 00:12:15.025 }, 00:12:15.025 { 00:12:15.025 "name": "BaseBdev3", 00:12:15.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.025 "is_configured": false, 00:12:15.025 "data_offset": 0, 00:12:15.025 "data_size": 0 00:12:15.025 } 00:12:15.025 ] 00:12:15.025 }' 00:12:15.025 23:33:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.025 23:33:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:15.591 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.591 [2024-07-24 23:34:00.457960] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.591 [2024-07-24 23:34:00.457995] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b03a0 name Existed_Raid, state configuring 00:12:15.591 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n 
Existed_Raid 00:12:15.849 [2024-07-24 23:34:00.622407] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:15.849 [2024-07-24 23:34:00.623398] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:15.849 [2024-07-24 23:34:00.623423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:15.849 [2024-07-24 23:34:00.623429] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:15.849 [2024-07-24 23:34:00.623433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.849 23:34:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.849 "name": "Existed_Raid", 00:12:15.849 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:15.849 "strip_size_kb": 64, 00:12:15.849 "state": "configuring", 00:12:15.849 "raid_level": "concat", 00:12:15.849 "superblock": true, 00:12:15.849 "num_base_bdevs": 3, 00:12:15.849 "num_base_bdevs_discovered": 1, 00:12:15.849 "num_base_bdevs_operational": 3, 00:12:15.849 "base_bdevs_list": [ 00:12:15.849 { 00:12:15.849 "name": "BaseBdev1", 00:12:15.849 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:15.849 "is_configured": true, 00:12:15.849 "data_offset": 2048, 00:12:15.849 "data_size": 63488 00:12:15.849 }, 00:12:15.849 { 00:12:15.849 "name": "BaseBdev2", 00:12:15.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.849 "is_configured": false, 00:12:15.849 "data_offset": 0, 00:12:15.849 "data_size": 0 00:12:15.849 }, 00:12:15.849 { 00:12:15.849 "name": "BaseBdev3", 00:12:15.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.849 "is_configured": false, 00:12:15.849 "data_offset": 0, 00:12:15.849 "data_size": 0 00:12:15.849 } 00:12:15.849 ] 00:12:15.849 }' 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.849 23:34:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:16.415 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:16.672 [2024-07-24 23:34:01.451123] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:16.673 BaseBdev2 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.673 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:16.931 [ 00:12:16.931 { 00:12:16.931 "name": "BaseBdev2", 00:12:16.931 "aliases": [ 00:12:16.931 "ec74d600-d4f2-420f-a9a6-432f044f5f37" 00:12:16.931 ], 00:12:16.931 "product_name": "Malloc disk", 00:12:16.931 "block_size": 512, 00:12:16.931 "num_blocks": 65536, 00:12:16.931 "uuid": "ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:16.931 "assigned_rate_limits": { 00:12:16.931 "rw_ios_per_sec": 0, 00:12:16.931 "rw_mbytes_per_sec": 0, 00:12:16.931 "r_mbytes_per_sec": 0, 00:12:16.931 "w_mbytes_per_sec": 0 00:12:16.931 }, 00:12:16.931 "claimed": true, 00:12:16.931 "claim_type": "exclusive_write", 00:12:16.931 "zoned": false, 00:12:16.931 "supported_io_types": { 
00:12:16.931 "read": true, 00:12:16.931 "write": true, 00:12:16.931 "unmap": true, 00:12:16.931 "flush": true, 00:12:16.931 "reset": true, 00:12:16.931 "nvme_admin": false, 00:12:16.931 "nvme_io": false, 00:12:16.931 "nvme_io_md": false, 00:12:16.931 "write_zeroes": true, 00:12:16.931 "zcopy": true, 00:12:16.931 "get_zone_info": false, 00:12:16.931 "zone_management": false, 00:12:16.931 "zone_append": false, 00:12:16.931 "compare": false, 00:12:16.931 "compare_and_write": false, 00:12:16.931 "abort": true, 00:12:16.931 "seek_hole": false, 00:12:16.931 "seek_data": false, 00:12:16.931 "copy": true, 00:12:16.931 "nvme_iov_md": false 00:12:16.931 }, 00:12:16.931 "memory_domains": [ 00:12:16.931 { 00:12:16.931 "dma_device_id": "system", 00:12:16.931 "dma_device_type": 1 00:12:16.931 }, 00:12:16.931 { 00:12:16.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.931 "dma_device_type": 2 00:12:16.931 } 00:12:16.931 ], 00:12:16.931 "driver_specific": {} 00:12:16.931 } 00:12:16.931 ] 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.931 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.189 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.189 "name": "Existed_Raid", 00:12:17.189 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:17.189 "strip_size_kb": 64, 00:12:17.189 "state": "configuring", 00:12:17.189 "raid_level": "concat", 00:12:17.189 "superblock": true, 00:12:17.189 "num_base_bdevs": 3, 00:12:17.189 "num_base_bdevs_discovered": 2, 00:12:17.189 "num_base_bdevs_operational": 3, 00:12:17.189 "base_bdevs_list": [ 00:12:17.189 { 00:12:17.189 "name": "BaseBdev1", 00:12:17.189 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:17.189 "is_configured": true, 00:12:17.189 "data_offset": 2048, 00:12:17.189 "data_size": 63488 00:12:17.189 }, 00:12:17.189 { 00:12:17.189 "name": "BaseBdev2", 00:12:17.189 "uuid": "ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:17.189 "is_configured": true, 00:12:17.189 "data_offset": 2048, 00:12:17.189 "data_size": 63488 00:12:17.189 }, 00:12:17.189 { 00:12:17.189 "name": "BaseBdev3", 00:12:17.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.189 "is_configured": false, 00:12:17.189 "data_offset": 0, 00:12:17.189 
"data_size": 0 00:12:17.189 } 00:12:17.189 ] 00:12:17.189 }' 00:12:17.189 23:34:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.189 23:34:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:17.446 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:17.703 [2024-07-24 23:34:02.596674] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:17.703 [2024-07-24 23:34:02.596805] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b12a0 00:12:17.703 [2024-07-24 23:34:02.596815] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:17.703 [2024-07-24 23:34:02.596936] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b2310 00:12:17.703 [2024-07-24 23:34:02.597019] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b12a0 00:12:17.703 [2024-07-24 23:34:02.597025] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22b12a0 00:12:17.703 [2024-07-24 23:34:02.597089] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.703 BaseBdev3 00:12:17.703 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:17.703 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:12:17.704 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:17.704 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:17.704 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:17.704 23:34:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:17.704 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:17.961 [ 00:12:17.961 { 00:12:17.961 "name": "BaseBdev3", 00:12:17.961 "aliases": [ 00:12:17.961 "de5b187b-ca30-4858-9e8b-ff3cc8526b23" 00:12:17.961 ], 00:12:17.961 "product_name": "Malloc disk", 00:12:17.961 "block_size": 512, 00:12:17.961 "num_blocks": 65536, 00:12:17.961 "uuid": "de5b187b-ca30-4858-9e8b-ff3cc8526b23", 00:12:17.961 "assigned_rate_limits": { 00:12:17.961 "rw_ios_per_sec": 0, 00:12:17.961 "rw_mbytes_per_sec": 0, 00:12:17.961 "r_mbytes_per_sec": 0, 00:12:17.961 "w_mbytes_per_sec": 0 00:12:17.961 }, 00:12:17.961 "claimed": true, 00:12:17.961 "claim_type": "exclusive_write", 00:12:17.961 "zoned": false, 00:12:17.961 "supported_io_types": { 00:12:17.961 "read": true, 00:12:17.961 "write": true, 00:12:17.961 "unmap": true, 00:12:17.961 "flush": true, 00:12:17.961 "reset": true, 00:12:17.961 "nvme_admin": false, 00:12:17.961 "nvme_io": false, 00:12:17.961 "nvme_io_md": false, 00:12:17.961 "write_zeroes": true, 00:12:17.961 "zcopy": true, 00:12:17.961 "get_zone_info": false, 00:12:17.961 "zone_management": false, 00:12:17.961 "zone_append": false, 00:12:17.961 "compare": false, 00:12:17.961 "compare_and_write": false, 00:12:17.961 "abort": true, 00:12:17.961 "seek_hole": false, 00:12:17.961 "seek_data": false, 00:12:17.961 "copy": true, 00:12:17.961 "nvme_iov_md": false 00:12:17.961 }, 00:12:17.961 "memory_domains": [ 00:12:17.961 { 00:12:17.961 "dma_device_id": "system", 00:12:17.961 "dma_device_type": 1 00:12:17.961 }, 
00:12:17.961 { 00:12:17.961 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.961 "dma_device_type": 2 00:12:17.961 } 00:12:17.961 ], 00:12:17.961 "driver_specific": {} 00:12:17.961 } 00:12:17.961 ] 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.961 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.962 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.962 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.962 23:34:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:12:18.229 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.229 "name": "Existed_Raid", 00:12:18.229 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:18.229 "strip_size_kb": 64, 00:12:18.229 "state": "online", 00:12:18.229 "raid_level": "concat", 00:12:18.229 "superblock": true, 00:12:18.229 "num_base_bdevs": 3, 00:12:18.229 "num_base_bdevs_discovered": 3, 00:12:18.229 "num_base_bdevs_operational": 3, 00:12:18.229 "base_bdevs_list": [ 00:12:18.229 { 00:12:18.229 "name": "BaseBdev1", 00:12:18.229 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:18.229 "is_configured": true, 00:12:18.229 "data_offset": 2048, 00:12:18.229 "data_size": 63488 00:12:18.229 }, 00:12:18.229 { 00:12:18.229 "name": "BaseBdev2", 00:12:18.229 "uuid": "ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:18.229 "is_configured": true, 00:12:18.229 "data_offset": 2048, 00:12:18.229 "data_size": 63488 00:12:18.229 }, 00:12:18.229 { 00:12:18.229 "name": "BaseBdev3", 00:12:18.229 "uuid": "de5b187b-ca30-4858-9e8b-ff3cc8526b23", 00:12:18.229 "is_configured": true, 00:12:18.229 "data_offset": 2048, 00:12:18.229 "data_size": 63488 00:12:18.229 } 00:12:18.229 ] 00:12:18.229 }' 00:12:18.229 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.229 23:34:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:18.841 23:34:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:18.841 [2024-07-24 23:34:03.727803] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:18.841 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:18.841 "name": "Existed_Raid", 00:12:18.841 "aliases": [ 00:12:18.841 "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b" 00:12:18.841 ], 00:12:18.841 "product_name": "Raid Volume", 00:12:18.841 "block_size": 512, 00:12:18.841 "num_blocks": 190464, 00:12:18.841 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:18.841 "assigned_rate_limits": { 00:12:18.841 "rw_ios_per_sec": 0, 00:12:18.841 "rw_mbytes_per_sec": 0, 00:12:18.841 "r_mbytes_per_sec": 0, 00:12:18.841 "w_mbytes_per_sec": 0 00:12:18.841 }, 00:12:18.841 "claimed": false, 00:12:18.841 "zoned": false, 00:12:18.841 "supported_io_types": { 00:12:18.841 "read": true, 00:12:18.841 "write": true, 00:12:18.841 "unmap": true, 00:12:18.841 "flush": true, 00:12:18.841 "reset": true, 00:12:18.841 "nvme_admin": false, 00:12:18.841 "nvme_io": false, 00:12:18.841 "nvme_io_md": false, 00:12:18.841 "write_zeroes": true, 00:12:18.841 "zcopy": false, 00:12:18.841 "get_zone_info": false, 00:12:18.841 "zone_management": false, 00:12:18.841 "zone_append": false, 00:12:18.841 "compare": false, 00:12:18.841 "compare_and_write": false, 00:12:18.841 "abort": false, 00:12:18.841 "seek_hole": false, 00:12:18.841 "seek_data": false, 00:12:18.841 "copy": false, 00:12:18.842 "nvme_iov_md": false 00:12:18.842 }, 00:12:18.842 
"memory_domains": [ 00:12:18.842 { 00:12:18.842 "dma_device_id": "system", 00:12:18.842 "dma_device_type": 1 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.842 "dma_device_type": 2 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "dma_device_id": "system", 00:12:18.842 "dma_device_type": 1 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.842 "dma_device_type": 2 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "dma_device_id": "system", 00:12:18.842 "dma_device_type": 1 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.842 "dma_device_type": 2 00:12:18.842 } 00:12:18.842 ], 00:12:18.842 "driver_specific": { 00:12:18.842 "raid": { 00:12:18.842 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:18.842 "strip_size_kb": 64, 00:12:18.842 "state": "online", 00:12:18.842 "raid_level": "concat", 00:12:18.842 "superblock": true, 00:12:18.842 "num_base_bdevs": 3, 00:12:18.842 "num_base_bdevs_discovered": 3, 00:12:18.842 "num_base_bdevs_operational": 3, 00:12:18.842 "base_bdevs_list": [ 00:12:18.842 { 00:12:18.842 "name": "BaseBdev1", 00:12:18.842 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:18.842 "is_configured": true, 00:12:18.842 "data_offset": 2048, 00:12:18.842 "data_size": 63488 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "name": "BaseBdev2", 00:12:18.842 "uuid": "ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:18.842 "is_configured": true, 00:12:18.842 "data_offset": 2048, 00:12:18.842 "data_size": 63488 00:12:18.842 }, 00:12:18.842 { 00:12:18.842 "name": "BaseBdev3", 00:12:18.842 "uuid": "de5b187b-ca30-4858-9e8b-ff3cc8526b23", 00:12:18.842 "is_configured": true, 00:12:18.842 "data_offset": 2048, 00:12:18.842 "data_size": 63488 00:12:18.842 } 00:12:18.842 ] 00:12:18.842 } 00:12:18.842 } 00:12:18.842 }' 00:12:18.842 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:18.842 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:18.842 BaseBdev2 00:12:18.842 BaseBdev3' 00:12:18.842 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.842 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:18.842 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.099 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.099 "name": "BaseBdev1", 00:12:19.099 "aliases": [ 00:12:19.099 "61193c0f-0f67-46c8-a56e-e565c731c7b2" 00:12:19.099 ], 00:12:19.099 "product_name": "Malloc disk", 00:12:19.099 "block_size": 512, 00:12:19.099 "num_blocks": 65536, 00:12:19.099 "uuid": "61193c0f-0f67-46c8-a56e-e565c731c7b2", 00:12:19.099 "assigned_rate_limits": { 00:12:19.099 "rw_ios_per_sec": 0, 00:12:19.099 "rw_mbytes_per_sec": 0, 00:12:19.099 "r_mbytes_per_sec": 0, 00:12:19.099 "w_mbytes_per_sec": 0 00:12:19.099 }, 00:12:19.099 "claimed": true, 00:12:19.099 "claim_type": "exclusive_write", 00:12:19.099 "zoned": false, 00:12:19.099 "supported_io_types": { 00:12:19.099 "read": true, 00:12:19.099 "write": true, 00:12:19.099 "unmap": true, 00:12:19.099 "flush": true, 00:12:19.099 "reset": true, 00:12:19.099 "nvme_admin": false, 00:12:19.099 "nvme_io": false, 00:12:19.099 "nvme_io_md": false, 00:12:19.099 "write_zeroes": true, 00:12:19.099 "zcopy": true, 00:12:19.099 "get_zone_info": false, 00:12:19.099 "zone_management": false, 00:12:19.099 "zone_append": false, 00:12:19.099 "compare": false, 00:12:19.099 "compare_and_write": false, 00:12:19.099 "abort": true, 00:12:19.099 "seek_hole": false, 00:12:19.099 "seek_data": false, 
00:12:19.099 "copy": true, 00:12:19.099 "nvme_iov_md": false 00:12:19.099 }, 00:12:19.099 "memory_domains": [ 00:12:19.099 { 00:12:19.099 "dma_device_id": "system", 00:12:19.099 "dma_device_type": 1 00:12:19.099 }, 00:12:19.099 { 00:12:19.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.099 "dma_device_type": 2 00:12:19.099 } 00:12:19.099 ], 00:12:19.099 "driver_specific": {} 00:12:19.099 }' 00:12:19.099 23:34:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.099 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.099 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.099 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.099 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:12:19.357 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.614 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.614 "name": "BaseBdev2", 00:12:19.615 "aliases": [ 00:12:19.615 "ec74d600-d4f2-420f-a9a6-432f044f5f37" 00:12:19.615 ], 00:12:19.615 "product_name": "Malloc disk", 00:12:19.615 "block_size": 512, 00:12:19.615 "num_blocks": 65536, 00:12:19.615 "uuid": "ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:19.615 "assigned_rate_limits": { 00:12:19.615 "rw_ios_per_sec": 0, 00:12:19.615 "rw_mbytes_per_sec": 0, 00:12:19.615 "r_mbytes_per_sec": 0, 00:12:19.615 "w_mbytes_per_sec": 0 00:12:19.615 }, 00:12:19.615 "claimed": true, 00:12:19.615 "claim_type": "exclusive_write", 00:12:19.615 "zoned": false, 00:12:19.615 "supported_io_types": { 00:12:19.615 "read": true, 00:12:19.615 "write": true, 00:12:19.615 "unmap": true, 00:12:19.615 "flush": true, 00:12:19.615 "reset": true, 00:12:19.615 "nvme_admin": false, 00:12:19.615 "nvme_io": false, 00:12:19.615 "nvme_io_md": false, 00:12:19.615 "write_zeroes": true, 00:12:19.615 "zcopy": true, 00:12:19.615 "get_zone_info": false, 00:12:19.615 "zone_management": false, 00:12:19.615 "zone_append": false, 00:12:19.615 "compare": false, 00:12:19.615 "compare_and_write": false, 00:12:19.615 "abort": true, 00:12:19.615 "seek_hole": false, 00:12:19.615 "seek_data": false, 00:12:19.615 "copy": true, 00:12:19.615 "nvme_iov_md": false 00:12:19.615 }, 00:12:19.615 "memory_domains": [ 00:12:19.615 { 00:12:19.615 "dma_device_id": "system", 00:12:19.615 "dma_device_type": 1 00:12:19.615 }, 00:12:19.615 { 00:12:19.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.615 "dma_device_type": 2 00:12:19.615 } 00:12:19.615 ], 00:12:19.615 "driver_specific": {} 00:12:19.615 }' 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.615 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:19.873 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:20.131 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:20.131 "name": "BaseBdev3", 00:12:20.131 "aliases": [ 00:12:20.131 "de5b187b-ca30-4858-9e8b-ff3cc8526b23" 00:12:20.131 ], 00:12:20.131 "product_name": "Malloc disk", 00:12:20.131 "block_size": 512, 00:12:20.131 "num_blocks": 65536, 00:12:20.131 "uuid": "de5b187b-ca30-4858-9e8b-ff3cc8526b23", 00:12:20.131 "assigned_rate_limits": { 00:12:20.131 
"rw_ios_per_sec": 0, 00:12:20.131 "rw_mbytes_per_sec": 0, 00:12:20.131 "r_mbytes_per_sec": 0, 00:12:20.131 "w_mbytes_per_sec": 0 00:12:20.131 }, 00:12:20.131 "claimed": true, 00:12:20.131 "claim_type": "exclusive_write", 00:12:20.131 "zoned": false, 00:12:20.131 "supported_io_types": { 00:12:20.131 "read": true, 00:12:20.131 "write": true, 00:12:20.131 "unmap": true, 00:12:20.131 "flush": true, 00:12:20.131 "reset": true, 00:12:20.131 "nvme_admin": false, 00:12:20.131 "nvme_io": false, 00:12:20.131 "nvme_io_md": false, 00:12:20.131 "write_zeroes": true, 00:12:20.131 "zcopy": true, 00:12:20.131 "get_zone_info": false, 00:12:20.131 "zone_management": false, 00:12:20.131 "zone_append": false, 00:12:20.131 "compare": false, 00:12:20.131 "compare_and_write": false, 00:12:20.131 "abort": true, 00:12:20.131 "seek_hole": false, 00:12:20.131 "seek_data": false, 00:12:20.131 "copy": true, 00:12:20.131 "nvme_iov_md": false 00:12:20.131 }, 00:12:20.131 "memory_domains": [ 00:12:20.131 { 00:12:20.131 "dma_device_id": "system", 00:12:20.131 "dma_device_type": 1 00:12:20.131 }, 00:12:20.131 { 00:12:20.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.131 "dma_device_type": 2 00:12:20.131 } 00:12:20.131 ], 00:12:20.131 "driver_specific": {} 00:12:20.131 }' 00:12:20.131 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.131 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.131 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:20.131 23:34:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.131 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.131 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:20.131 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:12:20.131 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.389 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:20.389 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.389 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.389 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:20.389 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:20.389 [2024-07-24 23:34:05.383916] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:20.389 [2024-07-24 23:34:05.383935] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:20.389 [2024-07-24 23:34:05.383960] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:20.648 "name": "Existed_Raid", 00:12:20.648 "uuid": "22bb4a0a-6068-4592-bf6b-79aa3fe0ae5b", 00:12:20.648 "strip_size_kb": 64, 00:12:20.648 "state": "offline", 00:12:20.648 "raid_level": "concat", 00:12:20.648 "superblock": true, 00:12:20.648 "num_base_bdevs": 3, 00:12:20.648 "num_base_bdevs_discovered": 2, 00:12:20.648 "num_base_bdevs_operational": 2, 00:12:20.648 "base_bdevs_list": [ 00:12:20.648 { 00:12:20.648 "name": null, 00:12:20.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:20.648 "is_configured": false, 00:12:20.648 "data_offset": 2048, 00:12:20.648 "data_size": 63488 00:12:20.648 }, 00:12:20.648 { 00:12:20.648 "name": "BaseBdev2", 00:12:20.648 "uuid": 
"ec74d600-d4f2-420f-a9a6-432f044f5f37", 00:12:20.648 "is_configured": true, 00:12:20.648 "data_offset": 2048, 00:12:20.648 "data_size": 63488 00:12:20.648 }, 00:12:20.648 { 00:12:20.648 "name": "BaseBdev3", 00:12:20.648 "uuid": "de5b187b-ca30-4858-9e8b-ff3cc8526b23", 00:12:20.648 "is_configured": true, 00:12:20.648 "data_offset": 2048, 00:12:20.648 "data_size": 63488 00:12:20.648 } 00:12:20.648 ] 00:12:20.648 }' 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:20.648 23:34:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:21.213 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:21.213 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.213 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:21.213 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.471 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:21.471 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:21.471 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:21.471 [2024-07-24 23:34:06.399459] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:21.471 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:21.472 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.472 23:34:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.472 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:21.730 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:21.730 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:21.730 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:21.987 [2024-07-24 23:34:06.750140] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:21.987 [2024-07-24 23:34:06.750171] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b12a0 name Existed_Raid, state offline 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 
-- # (( i < num_base_bdevs )) 00:12:21.987 23:34:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:22.246 BaseBdev2 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:22.246 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:22.504 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:22.504 [ 00:12:22.504 { 00:12:22.504 "name": "BaseBdev2", 00:12:22.504 "aliases": [ 00:12:22.504 "ff170074-d98c-4306-896c-89be87bc71ba" 00:12:22.504 ], 00:12:22.504 "product_name": "Malloc disk", 00:12:22.504 "block_size": 512, 00:12:22.504 "num_blocks": 65536, 00:12:22.504 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:22.504 "assigned_rate_limits": { 00:12:22.504 "rw_ios_per_sec": 0, 00:12:22.504 "rw_mbytes_per_sec": 0, 00:12:22.504 "r_mbytes_per_sec": 0, 00:12:22.504 "w_mbytes_per_sec": 0 00:12:22.504 }, 00:12:22.504 "claimed": false, 00:12:22.504 "zoned": false, 00:12:22.504 
"supported_io_types": { 00:12:22.504 "read": true, 00:12:22.504 "write": true, 00:12:22.504 "unmap": true, 00:12:22.504 "flush": true, 00:12:22.504 "reset": true, 00:12:22.504 "nvme_admin": false, 00:12:22.504 "nvme_io": false, 00:12:22.504 "nvme_io_md": false, 00:12:22.504 "write_zeroes": true, 00:12:22.504 "zcopy": true, 00:12:22.504 "get_zone_info": false, 00:12:22.504 "zone_management": false, 00:12:22.504 "zone_append": false, 00:12:22.504 "compare": false, 00:12:22.504 "compare_and_write": false, 00:12:22.504 "abort": true, 00:12:22.504 "seek_hole": false, 00:12:22.504 "seek_data": false, 00:12:22.504 "copy": true, 00:12:22.504 "nvme_iov_md": false 00:12:22.504 }, 00:12:22.504 "memory_domains": [ 00:12:22.504 { 00:12:22.504 "dma_device_id": "system", 00:12:22.504 "dma_device_type": 1 00:12:22.504 }, 00:12:22.504 { 00:12:22.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.504 "dma_device_type": 2 00:12:22.504 } 00:12:22.504 ], 00:12:22.504 "driver_specific": {} 00:12:22.504 } 00:12:22.504 ] 00:12:22.504 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:22.504 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:22.504 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:22.504 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:22.762 BaseBdev3 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:22.762 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:23.021 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:23.021 [ 00:12:23.021 { 00:12:23.021 "name": "BaseBdev3", 00:12:23.021 "aliases": [ 00:12:23.021 "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9" 00:12:23.021 ], 00:12:23.021 "product_name": "Malloc disk", 00:12:23.021 "block_size": 512, 00:12:23.021 "num_blocks": 65536, 00:12:23.021 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:23.021 "assigned_rate_limits": { 00:12:23.021 "rw_ios_per_sec": 0, 00:12:23.021 "rw_mbytes_per_sec": 0, 00:12:23.021 "r_mbytes_per_sec": 0, 00:12:23.021 "w_mbytes_per_sec": 0 00:12:23.021 }, 00:12:23.021 "claimed": false, 00:12:23.021 "zoned": false, 00:12:23.021 "supported_io_types": { 00:12:23.021 "read": true, 00:12:23.021 "write": true, 00:12:23.021 "unmap": true, 00:12:23.021 "flush": true, 00:12:23.021 "reset": true, 00:12:23.021 "nvme_admin": false, 00:12:23.021 "nvme_io": false, 00:12:23.021 "nvme_io_md": false, 00:12:23.021 "write_zeroes": true, 00:12:23.021 "zcopy": true, 00:12:23.021 "get_zone_info": false, 00:12:23.021 "zone_management": false, 00:12:23.021 "zone_append": false, 00:12:23.021 "compare": false, 00:12:23.021 "compare_and_write": false, 00:12:23.021 "abort": true, 00:12:23.021 "seek_hole": false, 00:12:23.021 "seek_data": false, 00:12:23.021 "copy": true, 00:12:23.021 "nvme_iov_md": false 00:12:23.021 }, 00:12:23.021 
"memory_domains": [ 00:12:23.021 { 00:12:23.021 "dma_device_id": "system", 00:12:23.021 "dma_device_type": 1 00:12:23.021 }, 00:12:23.021 { 00:12:23.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.021 "dma_device_type": 2 00:12:23.021 } 00:12:23.021 ], 00:12:23.021 "driver_specific": {} 00:12:23.021 } 00:12:23.021 ] 00:12:23.021 23:34:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:23.021 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:23.021 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:23.021 23:34:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:23.279 [2024-07-24 23:34:08.074833] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:23.279 [2024-07-24 23:34:08.074862] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:23.279 [2024-07-24 23:34:08.074873] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:23.279 [2024-07-24 23:34:08.075796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:23.279 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:23.279 "name": "Existed_Raid", 00:12:23.279 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:23.279 "strip_size_kb": 64, 00:12:23.279 "state": "configuring", 00:12:23.279 "raid_level": "concat", 00:12:23.279 "superblock": true, 00:12:23.279 "num_base_bdevs": 3, 00:12:23.279 "num_base_bdevs_discovered": 2, 00:12:23.279 "num_base_bdevs_operational": 3, 00:12:23.280 "base_bdevs_list": [ 00:12:23.280 { 00:12:23.280 "name": "BaseBdev1", 00:12:23.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:23.280 "is_configured": false, 00:12:23.280 "data_offset": 0, 00:12:23.280 "data_size": 0 00:12:23.280 }, 00:12:23.280 { 00:12:23.280 "name": "BaseBdev2", 00:12:23.280 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:23.280 "is_configured": true, 00:12:23.280 "data_offset": 2048, 00:12:23.280 "data_size": 63488 00:12:23.280 }, 00:12:23.280 { 00:12:23.280 "name": "BaseBdev3", 00:12:23.280 "uuid": 
"3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:23.280 "is_configured": true, 00:12:23.280 "data_offset": 2048, 00:12:23.280 "data_size": 63488 00:12:23.280 } 00:12:23.280 ] 00:12:23.280 }' 00:12:23.280 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.280 23:34:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:23.845 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:24.103 [2024-07-24 23:34:08.856838] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.103 23:34:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.103 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.103 "name": "Existed_Raid", 00:12:24.103 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:24.103 "strip_size_kb": 64, 00:12:24.103 "state": "configuring", 00:12:24.103 "raid_level": "concat", 00:12:24.103 "superblock": true, 00:12:24.103 "num_base_bdevs": 3, 00:12:24.103 "num_base_bdevs_discovered": 1, 00:12:24.103 "num_base_bdevs_operational": 3, 00:12:24.103 "base_bdevs_list": [ 00:12:24.103 { 00:12:24.103 "name": "BaseBdev1", 00:12:24.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.103 "is_configured": false, 00:12:24.103 "data_offset": 0, 00:12:24.103 "data_size": 0 00:12:24.103 }, 00:12:24.103 { 00:12:24.103 "name": null, 00:12:24.103 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:24.103 "is_configured": false, 00:12:24.103 "data_offset": 2048, 00:12:24.103 "data_size": 63488 00:12:24.103 }, 00:12:24.103 { 00:12:24.103 "name": "BaseBdev3", 00:12:24.103 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:24.103 "is_configured": true, 00:12:24.103 "data_offset": 2048, 00:12:24.103 "data_size": 63488 00:12:24.103 } 00:12:24.103 ] 00:12:24.103 }' 00:12:24.103 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.103 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.668 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.668 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:24.926 [2024-07-24 23:34:09.886169] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:24.926 BaseBdev1 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:24.926 23:34:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:25.185 23:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:25.443 [ 00:12:25.443 { 00:12:25.443 "name": "BaseBdev1", 00:12:25.443 "aliases": [ 00:12:25.443 "81020cab-4b20-4c26-a3cc-303bcda6651c" 00:12:25.443 ], 00:12:25.443 "product_name": "Malloc disk", 00:12:25.443 "block_size": 512, 00:12:25.443 "num_blocks": 65536, 00:12:25.443 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:25.443 
"assigned_rate_limits": { 00:12:25.443 "rw_ios_per_sec": 0, 00:12:25.443 "rw_mbytes_per_sec": 0, 00:12:25.443 "r_mbytes_per_sec": 0, 00:12:25.443 "w_mbytes_per_sec": 0 00:12:25.443 }, 00:12:25.443 "claimed": true, 00:12:25.443 "claim_type": "exclusive_write", 00:12:25.443 "zoned": false, 00:12:25.443 "supported_io_types": { 00:12:25.443 "read": true, 00:12:25.443 "write": true, 00:12:25.443 "unmap": true, 00:12:25.443 "flush": true, 00:12:25.443 "reset": true, 00:12:25.443 "nvme_admin": false, 00:12:25.443 "nvme_io": false, 00:12:25.443 "nvme_io_md": false, 00:12:25.443 "write_zeroes": true, 00:12:25.443 "zcopy": true, 00:12:25.443 "get_zone_info": false, 00:12:25.443 "zone_management": false, 00:12:25.443 "zone_append": false, 00:12:25.443 "compare": false, 00:12:25.443 "compare_and_write": false, 00:12:25.443 "abort": true, 00:12:25.443 "seek_hole": false, 00:12:25.443 "seek_data": false, 00:12:25.443 "copy": true, 00:12:25.443 "nvme_iov_md": false 00:12:25.443 }, 00:12:25.443 "memory_domains": [ 00:12:25.443 { 00:12:25.443 "dma_device_id": "system", 00:12:25.443 "dma_device_type": 1 00:12:25.443 }, 00:12:25.443 { 00:12:25.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.443 "dma_device_type": 2 00:12:25.443 } 00:12:25.443 ], 00:12:25.443 "driver_specific": {} 00:12:25.443 } 00:12:25.443 ] 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.443 "name": "Existed_Raid", 00:12:25.443 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:25.443 "strip_size_kb": 64, 00:12:25.443 "state": "configuring", 00:12:25.443 "raid_level": "concat", 00:12:25.443 "superblock": true, 00:12:25.443 "num_base_bdevs": 3, 00:12:25.443 "num_base_bdevs_discovered": 2, 00:12:25.443 "num_base_bdevs_operational": 3, 00:12:25.443 "base_bdevs_list": [ 00:12:25.443 { 00:12:25.443 "name": "BaseBdev1", 00:12:25.443 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:25.443 "is_configured": true, 00:12:25.443 "data_offset": 2048, 00:12:25.443 "data_size": 63488 00:12:25.443 }, 00:12:25.443 { 00:12:25.443 "name": null, 00:12:25.443 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:25.443 "is_configured": false, 00:12:25.443 "data_offset": 2048, 00:12:25.443 "data_size": 63488 00:12:25.443 }, 00:12:25.443 { 00:12:25.443 "name": "BaseBdev3", 00:12:25.443 "uuid": 
"3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:25.443 "is_configured": true, 00:12:25.443 "data_offset": 2048, 00:12:25.443 "data_size": 63488 00:12:25.443 } 00:12:25.443 ] 00:12:25.443 }' 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.443 23:34:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:26.008 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.008 23:34:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:26.266 [2024-07-24 23:34:11.233688] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.266 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.526 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.526 "name": "Existed_Raid", 00:12:26.526 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:26.526 "strip_size_kb": 64, 00:12:26.526 "state": "configuring", 00:12:26.526 "raid_level": "concat", 00:12:26.526 "superblock": true, 00:12:26.527 "num_base_bdevs": 3, 00:12:26.527 "num_base_bdevs_discovered": 1, 00:12:26.527 "num_base_bdevs_operational": 3, 00:12:26.527 "base_bdevs_list": [ 00:12:26.527 { 00:12:26.527 "name": "BaseBdev1", 00:12:26.527 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:26.527 "is_configured": true, 00:12:26.527 "data_offset": 2048, 00:12:26.527 "data_size": 63488 00:12:26.527 }, 00:12:26.527 { 00:12:26.527 "name": null, 00:12:26.527 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:26.527 "is_configured": false, 00:12:26.527 "data_offset": 2048, 00:12:26.527 "data_size": 63488 00:12:26.527 }, 00:12:26.527 { 00:12:26.527 "name": null, 00:12:26.527 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:26.527 "is_configured": false, 00:12:26.527 "data_offset": 2048, 00:12:26.527 "data_size": 63488 00:12:26.527 } 00:12:26.527 ] 00:12:26.527 }' 00:12:26.527 23:34:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.527 23:34:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.093 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.093 23:34:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:27.093 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:27.093 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:27.351 [2024-07-24 23:34:12.228265] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.351 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.609 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.609 "name": "Existed_Raid", 00:12:27.609 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:27.609 "strip_size_kb": 64, 00:12:27.609 "state": "configuring", 00:12:27.609 "raid_level": "concat", 00:12:27.609 "superblock": true, 00:12:27.609 "num_base_bdevs": 3, 00:12:27.609 "num_base_bdevs_discovered": 2, 00:12:27.609 "num_base_bdevs_operational": 3, 00:12:27.609 "base_bdevs_list": [ 00:12:27.609 { 00:12:27.609 "name": "BaseBdev1", 00:12:27.609 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:27.609 "is_configured": true, 00:12:27.609 "data_offset": 2048, 00:12:27.609 "data_size": 63488 00:12:27.609 }, 00:12:27.609 { 00:12:27.609 "name": null, 00:12:27.609 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:27.609 "is_configured": false, 00:12:27.609 "data_offset": 2048, 00:12:27.609 "data_size": 63488 00:12:27.609 }, 00:12:27.609 { 00:12:27.609 "name": "BaseBdev3", 00:12:27.609 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:27.609 "is_configured": true, 00:12:27.609 "data_offset": 2048, 00:12:27.609 "data_size": 63488 00:12:27.609 } 00:12:27.609 ] 00:12:27.609 }' 00:12:27.609 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.609 23:34:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.176 23:34:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:28.176 23:34:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.176 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:28.176 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:28.434 [2024-07-24 23:34:13.246919] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.434 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.435 "name": "Existed_Raid", 00:12:28.435 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:28.435 "strip_size_kb": 64, 00:12:28.435 "state": "configuring", 00:12:28.435 "raid_level": "concat", 00:12:28.435 "superblock": true, 00:12:28.435 "num_base_bdevs": 3, 00:12:28.435 "num_base_bdevs_discovered": 1, 00:12:28.435 "num_base_bdevs_operational": 3, 00:12:28.435 "base_bdevs_list": [ 00:12:28.435 { 00:12:28.435 "name": null, 00:12:28.435 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:28.435 "is_configured": false, 00:12:28.435 "data_offset": 2048, 00:12:28.435 "data_size": 63488 00:12:28.435 }, 00:12:28.435 { 00:12:28.435 "name": null, 00:12:28.435 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:28.435 "is_configured": false, 00:12:28.435 "data_offset": 2048, 00:12:28.435 "data_size": 63488 00:12:28.435 }, 00:12:28.435 { 00:12:28.435 "name": "BaseBdev3", 00:12:28.435 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:28.435 "is_configured": true, 00:12:28.435 "data_offset": 2048, 00:12:28.435 "data_size": 63488 00:12:28.435 } 00:12:28.435 ] 00:12:28.435 }' 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.435 23:34:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.998 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:28.998 23:34:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:12:29.255 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:29.255 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:29.255 [2024-07-24 23:34:14.243332] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.512 "name": "Existed_Raid", 00:12:29.512 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:29.512 "strip_size_kb": 64, 00:12:29.512 "state": "configuring", 00:12:29.512 "raid_level": "concat", 00:12:29.512 "superblock": true, 00:12:29.512 "num_base_bdevs": 3, 00:12:29.512 "num_base_bdevs_discovered": 2, 00:12:29.512 "num_base_bdevs_operational": 3, 00:12:29.512 "base_bdevs_list": [ 00:12:29.512 { 00:12:29.512 "name": null, 00:12:29.512 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:29.512 "is_configured": false, 00:12:29.512 "data_offset": 2048, 00:12:29.512 "data_size": 63488 00:12:29.512 }, 00:12:29.512 { 00:12:29.512 "name": "BaseBdev2", 00:12:29.512 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:29.512 "is_configured": true, 00:12:29.512 "data_offset": 2048, 00:12:29.512 "data_size": 63488 00:12:29.512 }, 00:12:29.512 { 00:12:29.512 "name": "BaseBdev3", 00:12:29.512 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:29.512 "is_configured": true, 00:12:29.512 "data_offset": 2048, 00:12:29.512 "data_size": 63488 00:12:29.512 } 00:12:29.512 ] 00:12:29.512 }' 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.512 23:34:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.077 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:30.077 23:34:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.335 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:30.335 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r 
'.[0].base_bdevs_list[0].uuid' 00:12:30.335 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.335 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 81020cab-4b20-4c26-a3cc-303bcda6651c 00:12:30.593 [2024-07-24 23:34:15.437095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:30.593 [2024-07-24 23:34:15.437216] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2462ac0 00:12:30.593 [2024-07-24 23:34:15.437224] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:30.593 [2024-07-24 23:34:15.437340] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24621e0 00:12:30.593 [2024-07-24 23:34:15.437418] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2462ac0 00:12:30.593 [2024-07-24 23:34:15.437423] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2462ac0 00:12:30.593 [2024-07-24 23:34:15.437490] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:30.593 NewBaseBdev 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:30.593 
23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:30.593 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:30.852 [ 00:12:30.852 { 00:12:30.852 "name": "NewBaseBdev", 00:12:30.852 "aliases": [ 00:12:30.852 "81020cab-4b20-4c26-a3cc-303bcda6651c" 00:12:30.852 ], 00:12:30.852 "product_name": "Malloc disk", 00:12:30.852 "block_size": 512, 00:12:30.852 "num_blocks": 65536, 00:12:30.852 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:30.852 "assigned_rate_limits": { 00:12:30.852 "rw_ios_per_sec": 0, 00:12:30.852 "rw_mbytes_per_sec": 0, 00:12:30.852 "r_mbytes_per_sec": 0, 00:12:30.852 "w_mbytes_per_sec": 0 00:12:30.852 }, 00:12:30.852 "claimed": true, 00:12:30.852 "claim_type": "exclusive_write", 00:12:30.852 "zoned": false, 00:12:30.852 "supported_io_types": { 00:12:30.852 "read": true, 00:12:30.852 "write": true, 00:12:30.852 "unmap": true, 00:12:30.852 "flush": true, 00:12:30.852 "reset": true, 00:12:30.852 "nvme_admin": false, 00:12:30.852 "nvme_io": false, 00:12:30.852 "nvme_io_md": false, 00:12:30.852 "write_zeroes": true, 00:12:30.852 "zcopy": true, 00:12:30.852 "get_zone_info": false, 00:12:30.852 "zone_management": false, 00:12:30.852 "zone_append": false, 00:12:30.852 "compare": false, 00:12:30.852 "compare_and_write": false, 00:12:30.852 "abort": true, 00:12:30.852 "seek_hole": false, 00:12:30.852 "seek_data": false, 00:12:30.852 "copy": true, 00:12:30.852 "nvme_iov_md": false 00:12:30.852 }, 00:12:30.852 "memory_domains": [ 00:12:30.852 { 00:12:30.852 "dma_device_id": "system", 00:12:30.852 "dma_device_type": 1 00:12:30.852 
}, 00:12:30.852 { 00:12:30.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.852 "dma_device_type": 2 00:12:30.852 } 00:12:30.852 ], 00:12:30.852 "driver_specific": {} 00:12:30.852 } 00:12:30.852 ] 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.852 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.853 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.853 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.111 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.111 "name": "Existed_Raid", 00:12:31.111 "uuid": 
"e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:31.111 "strip_size_kb": 64, 00:12:31.111 "state": "online", 00:12:31.111 "raid_level": "concat", 00:12:31.111 "superblock": true, 00:12:31.111 "num_base_bdevs": 3, 00:12:31.111 "num_base_bdevs_discovered": 3, 00:12:31.111 "num_base_bdevs_operational": 3, 00:12:31.111 "base_bdevs_list": [ 00:12:31.111 { 00:12:31.111 "name": "NewBaseBdev", 00:12:31.111 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:31.111 "is_configured": true, 00:12:31.111 "data_offset": 2048, 00:12:31.111 "data_size": 63488 00:12:31.111 }, 00:12:31.111 { 00:12:31.111 "name": "BaseBdev2", 00:12:31.111 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:31.111 "is_configured": true, 00:12:31.111 "data_offset": 2048, 00:12:31.111 "data_size": 63488 00:12:31.111 }, 00:12:31.111 { 00:12:31.111 "name": "BaseBdev3", 00:12:31.111 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:31.111 "is_configured": true, 00:12:31.111 "data_offset": 2048, 00:12:31.111 "data_size": 63488 00:12:31.111 } 00:12:31.111 ] 00:12:31.111 }' 00:12:31.111 23:34:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.111 23:34:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:31.677 23:34:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:31.677 [2024-07-24 23:34:16.580248] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:31.677 "name": "Existed_Raid", 00:12:31.677 "aliases": [ 00:12:31.677 "e7d8acbb-1a0c-4571-80f9-03a81bbacd02" 00:12:31.677 ], 00:12:31.677 "product_name": "Raid Volume", 00:12:31.677 "block_size": 512, 00:12:31.677 "num_blocks": 190464, 00:12:31.677 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:31.677 "assigned_rate_limits": { 00:12:31.677 "rw_ios_per_sec": 0, 00:12:31.677 "rw_mbytes_per_sec": 0, 00:12:31.677 "r_mbytes_per_sec": 0, 00:12:31.677 "w_mbytes_per_sec": 0 00:12:31.677 }, 00:12:31.677 "claimed": false, 00:12:31.677 "zoned": false, 00:12:31.677 "supported_io_types": { 00:12:31.677 "read": true, 00:12:31.677 "write": true, 00:12:31.677 "unmap": true, 00:12:31.677 "flush": true, 00:12:31.677 "reset": true, 00:12:31.677 "nvme_admin": false, 00:12:31.677 "nvme_io": false, 00:12:31.677 "nvme_io_md": false, 00:12:31.677 "write_zeroes": true, 00:12:31.677 "zcopy": false, 00:12:31.677 "get_zone_info": false, 00:12:31.677 "zone_management": false, 00:12:31.677 "zone_append": false, 00:12:31.677 "compare": false, 00:12:31.677 "compare_and_write": false, 00:12:31.677 "abort": false, 00:12:31.677 "seek_hole": false, 00:12:31.677 "seek_data": false, 00:12:31.677 "copy": false, 00:12:31.677 "nvme_iov_md": false 00:12:31.677 }, 00:12:31.677 "memory_domains": [ 00:12:31.677 { 00:12:31.677 "dma_device_id": "system", 00:12:31.677 "dma_device_type": 1 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.677 
"dma_device_type": 2 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "dma_device_id": "system", 00:12:31.677 "dma_device_type": 1 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.677 "dma_device_type": 2 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "dma_device_id": "system", 00:12:31.677 "dma_device_type": 1 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.677 "dma_device_type": 2 00:12:31.677 } 00:12:31.677 ], 00:12:31.677 "driver_specific": { 00:12:31.677 "raid": { 00:12:31.677 "uuid": "e7d8acbb-1a0c-4571-80f9-03a81bbacd02", 00:12:31.677 "strip_size_kb": 64, 00:12:31.677 "state": "online", 00:12:31.677 "raid_level": "concat", 00:12:31.677 "superblock": true, 00:12:31.677 "num_base_bdevs": 3, 00:12:31.677 "num_base_bdevs_discovered": 3, 00:12:31.677 "num_base_bdevs_operational": 3, 00:12:31.677 "base_bdevs_list": [ 00:12:31.677 { 00:12:31.677 "name": "NewBaseBdev", 00:12:31.677 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:31.677 "is_configured": true, 00:12:31.677 "data_offset": 2048, 00:12:31.677 "data_size": 63488 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "name": "BaseBdev2", 00:12:31.677 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:31.677 "is_configured": true, 00:12:31.677 "data_offset": 2048, 00:12:31.677 "data_size": 63488 00:12:31.677 }, 00:12:31.677 { 00:12:31.677 "name": "BaseBdev3", 00:12:31.677 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:31.677 "is_configured": true, 00:12:31.677 "data_offset": 2048, 00:12:31.677 "data_size": 63488 00:12:31.677 } 00:12:31.677 ] 00:12:31.677 } 00:12:31.677 } 00:12:31.677 }' 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:31.677 BaseBdev2 00:12:31.677 
BaseBdev3' 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:31.677 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.935 "name": "NewBaseBdev", 00:12:31.935 "aliases": [ 00:12:31.935 "81020cab-4b20-4c26-a3cc-303bcda6651c" 00:12:31.935 ], 00:12:31.935 "product_name": "Malloc disk", 00:12:31.935 "block_size": 512, 00:12:31.935 "num_blocks": 65536, 00:12:31.935 "uuid": "81020cab-4b20-4c26-a3cc-303bcda6651c", 00:12:31.935 "assigned_rate_limits": { 00:12:31.935 "rw_ios_per_sec": 0, 00:12:31.935 "rw_mbytes_per_sec": 0, 00:12:31.935 "r_mbytes_per_sec": 0, 00:12:31.935 "w_mbytes_per_sec": 0 00:12:31.935 }, 00:12:31.935 "claimed": true, 00:12:31.935 "claim_type": "exclusive_write", 00:12:31.935 "zoned": false, 00:12:31.935 "supported_io_types": { 00:12:31.935 "read": true, 00:12:31.935 "write": true, 00:12:31.935 "unmap": true, 00:12:31.935 "flush": true, 00:12:31.935 "reset": true, 00:12:31.935 "nvme_admin": false, 00:12:31.935 "nvme_io": false, 00:12:31.935 "nvme_io_md": false, 00:12:31.935 "write_zeroes": true, 00:12:31.935 "zcopy": true, 00:12:31.935 "get_zone_info": false, 00:12:31.935 "zone_management": false, 00:12:31.935 "zone_append": false, 00:12:31.935 "compare": false, 00:12:31.935 "compare_and_write": false, 00:12:31.935 "abort": true, 00:12:31.935 "seek_hole": false, 00:12:31.935 "seek_data": false, 00:12:31.935 "copy": true, 00:12:31.935 "nvme_iov_md": false 00:12:31.935 }, 00:12:31.935 "memory_domains": [ 00:12:31.935 { 00:12:31.935 "dma_device_id": "system", 00:12:31.935 "dma_device_type": 1 00:12:31.935 }, 00:12:31.935 { 
00:12:31.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.935 "dma_device_type": 2 00:12:31.935 } 00:12:31.935 ], 00:12:31.935 "driver_specific": {} 00:12:31.935 }' 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.935 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.194 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.194 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.194 23:34:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.194 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.194 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.194 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.194 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:32.194 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.452 "name": 
"BaseBdev2", 00:12:32.452 "aliases": [ 00:12:32.452 "ff170074-d98c-4306-896c-89be87bc71ba" 00:12:32.452 ], 00:12:32.452 "product_name": "Malloc disk", 00:12:32.452 "block_size": 512, 00:12:32.452 "num_blocks": 65536, 00:12:32.452 "uuid": "ff170074-d98c-4306-896c-89be87bc71ba", 00:12:32.452 "assigned_rate_limits": { 00:12:32.452 "rw_ios_per_sec": 0, 00:12:32.452 "rw_mbytes_per_sec": 0, 00:12:32.452 "r_mbytes_per_sec": 0, 00:12:32.452 "w_mbytes_per_sec": 0 00:12:32.452 }, 00:12:32.452 "claimed": true, 00:12:32.452 "claim_type": "exclusive_write", 00:12:32.452 "zoned": false, 00:12:32.452 "supported_io_types": { 00:12:32.452 "read": true, 00:12:32.452 "write": true, 00:12:32.452 "unmap": true, 00:12:32.452 "flush": true, 00:12:32.452 "reset": true, 00:12:32.452 "nvme_admin": false, 00:12:32.452 "nvme_io": false, 00:12:32.452 "nvme_io_md": false, 00:12:32.452 "write_zeroes": true, 00:12:32.452 "zcopy": true, 00:12:32.452 "get_zone_info": false, 00:12:32.452 "zone_management": false, 00:12:32.452 "zone_append": false, 00:12:32.452 "compare": false, 00:12:32.452 "compare_and_write": false, 00:12:32.452 "abort": true, 00:12:32.452 "seek_hole": false, 00:12:32.452 "seek_data": false, 00:12:32.452 "copy": true, 00:12:32.452 "nvme_iov_md": false 00:12:32.452 }, 00:12:32.452 "memory_domains": [ 00:12:32.452 { 00:12:32.452 "dma_device_id": "system", 00:12:32.452 "dma_device_type": 1 00:12:32.452 }, 00:12:32.452 { 00:12:32.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.452 "dma_device_type": 2 00:12:32.452 } 00:12:32.452 ], 00:12:32.452 "driver_specific": {} 00:12:32.452 }' 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.452 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.710 "name": "BaseBdev3", 00:12:32.710 "aliases": [ 00:12:32.710 "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9" 00:12:32.710 ], 00:12:32.710 "product_name": "Malloc disk", 00:12:32.710 "block_size": 512, 00:12:32.710 "num_blocks": 65536, 00:12:32.710 "uuid": "3c33fe64-2fe3-4288-a1a2-20540a8dbdd9", 00:12:32.710 "assigned_rate_limits": { 00:12:32.710 "rw_ios_per_sec": 0, 00:12:32.710 "rw_mbytes_per_sec": 0, 00:12:32.710 "r_mbytes_per_sec": 0, 00:12:32.710 "w_mbytes_per_sec": 0 00:12:32.710 }, 00:12:32.710 "claimed": true, 00:12:32.710 "claim_type": "exclusive_write", 00:12:32.710 "zoned": 
false, 00:12:32.710 "supported_io_types": { 00:12:32.710 "read": true, 00:12:32.710 "write": true, 00:12:32.710 "unmap": true, 00:12:32.710 "flush": true, 00:12:32.710 "reset": true, 00:12:32.710 "nvme_admin": false, 00:12:32.710 "nvme_io": false, 00:12:32.710 "nvme_io_md": false, 00:12:32.710 "write_zeroes": true, 00:12:32.710 "zcopy": true, 00:12:32.710 "get_zone_info": false, 00:12:32.710 "zone_management": false, 00:12:32.710 "zone_append": false, 00:12:32.710 "compare": false, 00:12:32.710 "compare_and_write": false, 00:12:32.710 "abort": true, 00:12:32.710 "seek_hole": false, 00:12:32.710 "seek_data": false, 00:12:32.710 "copy": true, 00:12:32.710 "nvme_iov_md": false 00:12:32.710 }, 00:12:32.710 "memory_domains": [ 00:12:32.710 { 00:12:32.710 "dma_device_id": "system", 00:12:32.710 "dma_device_type": 1 00:12:32.710 }, 00:12:32.710 { 00:12:32.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.710 "dma_device_type": 2 00:12:32.710 } 00:12:32.710 ], 00:12:32.710 "driver_specific": {} 00:12:32.710 }' 00:12:32.710 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.968 23:34:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.968 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.227 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.227 23:34:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:33.227 [2024-07-24 23:34:18.124066] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:33.227 [2024-07-24 23:34:18.124085] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:33.227 [2024-07-24 23:34:18.124124] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.227 [2024-07-24 23:34:18.124176] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:33.227 [2024-07-24 23:34:18.124183] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2462ac0 name Existed_Raid, state offline 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 280436 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 280436 ']' 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 280436 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 280436 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 
00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 280436' 00:12:33.227 killing process with pid 280436 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 280436 00:12:33.227 [2024-07-24 23:34:18.178340] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:33.227 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 280436 00:12:33.227 [2024-07-24 23:34:18.201081] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:33.485 23:34:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:33.485 00:12:33.485 real 0m21.216s 00:12:33.485 user 0m39.488s 00:12:33.485 sys 0m3.287s 00:12:33.485 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:33.485 23:34:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.485 ************************************ 00:12:33.485 END TEST raid_state_function_test_sb 00:12:33.485 ************************************ 00:12:33.485 23:34:18 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:12:33.485 23:34:18 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:12:33.485 23:34:18 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:33.485 23:34:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:33.485 ************************************ 00:12:33.485 START TEST raid_superblock_test 00:12:33.485 ************************************ 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 3 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:33.485 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=284464 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 284464 /var/tmp/spdk-raid.sock 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 284464 ']' 00:12:33.486 23:34:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:33.486 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.486 23:34:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:33.486 [2024-07-24 23:34:18.480411] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:12:33.486 [2024-07-24 23:34:18.480450] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid284464 ] 00:12:33.744 [2024-07-24 23:34:18.543996] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:33.744 [2024-07-24 23:34:18.621869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:33.744 [2024-07-24 23:34:18.675329] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.744 [2024-07-24 23:34:18.675358] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:34.309 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:34.567 malloc1 00:12:34.567 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:34.858 [2024-07-24 23:34:19.591426] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:34.858 [2024-07-24 23:34:19.591460] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:34.858 [2024-07-24 23:34:19.591477] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f0dd0 00:12:34.858 [2024-07-24 23:34:19.591483] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:34.858 [2024-07-24 23:34:19.592594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:34.858 [2024-07-24 23:34:19.592617] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:34.858 pt1 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:34.858 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:34.858 23:34:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:34.858 malloc2 00:12:34.859 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:35.221 [2024-07-24 23:34:19.927705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:35.221 [2024-07-24 23:34:19.927739] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:35.221 [2024-07-24 23:34:19.927750] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f18d0 00:12:35.221 [2024-07-24 23:34:19.927756] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:35.221 [2024-07-24 23:34:19.928786] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:35.221 [2024-07-24 23:34:19.928807] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:35.221 pt2 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:35.221 23:34:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:35.221 23:34:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:35.221 malloc3 00:12:35.221 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:35.479 [2024-07-24 23:34:20.276311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:35.479 [2024-07-24 23:34:20.276344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:35.479 [2024-07-24 23:34:20.276354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b2740 00:12:35.479 [2024-07-24 23:34:20.276360] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:35.479 [2024-07-24 23:34:20.277418] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:35.479 [2024-07-24 23:34:20.277439] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:35.479 pt3 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:35.479 [2024-07-24 23:34:20.448776] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:35.479 [2024-07-24 23:34:20.449688] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:12:35.479 [2024-07-24 23:34:20.449731] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:35.479 [2024-07-24 23:34:20.449839] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b2d20 00:12:35.479 [2024-07-24 23:34:20.449846] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:35.479 [2024-07-24 23:34:20.449983] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b4860 00:12:35.479 [2024-07-24 23:34:20.450091] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b2d20 00:12:35.479 [2024-07-24 23:34:20.450096] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25b2d20 00:12:35.479 [2024-07-24 23:34:20.450163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:35.479 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.480 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.480 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.480 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.480 
23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.480 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:35.738 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.738 "name": "raid_bdev1", 00:12:35.738 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:35.738 "strip_size_kb": 64, 00:12:35.738 "state": "online", 00:12:35.738 "raid_level": "concat", 00:12:35.738 "superblock": true, 00:12:35.738 "num_base_bdevs": 3, 00:12:35.738 "num_base_bdevs_discovered": 3, 00:12:35.738 "num_base_bdevs_operational": 3, 00:12:35.738 "base_bdevs_list": [ 00:12:35.738 { 00:12:35.738 "name": "pt1", 00:12:35.738 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:35.738 "is_configured": true, 00:12:35.738 "data_offset": 2048, 00:12:35.738 "data_size": 63488 00:12:35.738 }, 00:12:35.738 { 00:12:35.738 "name": "pt2", 00:12:35.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:35.738 "is_configured": true, 00:12:35.738 "data_offset": 2048, 00:12:35.738 "data_size": 63488 00:12:35.738 }, 00:12:35.738 { 00:12:35.738 "name": "pt3", 00:12:35.738 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:35.738 "is_configured": true, 00:12:35.738 "data_offset": 2048, 00:12:35.738 "data_size": 63488 00:12:35.738 } 00:12:35.738 ] 00:12:35.738 }' 00:12:35.738 23:34:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.738 23:34:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:36.303 [2024-07-24 23:34:21.275059] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:36.303 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:36.303 "name": "raid_bdev1", 00:12:36.303 "aliases": [ 00:12:36.303 "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2" 00:12:36.303 ], 00:12:36.303 "product_name": "Raid Volume", 00:12:36.303 "block_size": 512, 00:12:36.303 "num_blocks": 190464, 00:12:36.303 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:36.303 "assigned_rate_limits": { 00:12:36.303 "rw_ios_per_sec": 0, 00:12:36.303 "rw_mbytes_per_sec": 0, 00:12:36.303 "r_mbytes_per_sec": 0, 00:12:36.303 "w_mbytes_per_sec": 0 00:12:36.303 }, 00:12:36.303 "claimed": false, 00:12:36.303 "zoned": false, 00:12:36.303 "supported_io_types": { 00:12:36.303 "read": true, 00:12:36.303 "write": true, 00:12:36.303 "unmap": true, 00:12:36.303 "flush": true, 00:12:36.303 "reset": true, 00:12:36.303 "nvme_admin": false, 00:12:36.303 "nvme_io": false, 00:12:36.303 "nvme_io_md": false, 00:12:36.303 "write_zeroes": true, 00:12:36.303 "zcopy": false, 00:12:36.303 "get_zone_info": false, 00:12:36.303 "zone_management": false, 00:12:36.303 "zone_append": false, 00:12:36.303 "compare": false, 00:12:36.303 "compare_and_write": false, 00:12:36.303 "abort": false, 00:12:36.303 
"seek_hole": false, 00:12:36.303 "seek_data": false, 00:12:36.303 "copy": false, 00:12:36.303 "nvme_iov_md": false 00:12:36.303 }, 00:12:36.303 "memory_domains": [ 00:12:36.303 { 00:12:36.303 "dma_device_id": "system", 00:12:36.303 "dma_device_type": 1 00:12:36.303 }, 00:12:36.303 { 00:12:36.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.303 "dma_device_type": 2 00:12:36.303 }, 00:12:36.303 { 00:12:36.304 "dma_device_id": "system", 00:12:36.304 "dma_device_type": 1 00:12:36.304 }, 00:12:36.304 { 00:12:36.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.304 "dma_device_type": 2 00:12:36.304 }, 00:12:36.304 { 00:12:36.304 "dma_device_id": "system", 00:12:36.304 "dma_device_type": 1 00:12:36.304 }, 00:12:36.304 { 00:12:36.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.304 "dma_device_type": 2 00:12:36.304 } 00:12:36.304 ], 00:12:36.304 "driver_specific": { 00:12:36.304 "raid": { 00:12:36.304 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:36.304 "strip_size_kb": 64, 00:12:36.304 "state": "online", 00:12:36.304 "raid_level": "concat", 00:12:36.304 "superblock": true, 00:12:36.304 "num_base_bdevs": 3, 00:12:36.304 "num_base_bdevs_discovered": 3, 00:12:36.304 "num_base_bdevs_operational": 3, 00:12:36.304 "base_bdevs_list": [ 00:12:36.304 { 00:12:36.304 "name": "pt1", 00:12:36.304 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:36.304 "is_configured": true, 00:12:36.304 "data_offset": 2048, 00:12:36.304 "data_size": 63488 00:12:36.304 }, 00:12:36.304 { 00:12:36.304 "name": "pt2", 00:12:36.304 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:36.304 "is_configured": true, 00:12:36.304 "data_offset": 2048, 00:12:36.304 "data_size": 63488 00:12:36.304 }, 00:12:36.304 { 00:12:36.304 "name": "pt3", 00:12:36.304 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:36.304 "is_configured": true, 00:12:36.304 "data_offset": 2048, 00:12:36.304 "data_size": 63488 00:12:36.304 } 00:12:36.304 ] 00:12:36.304 } 00:12:36.304 } 00:12:36.304 }' 
00:12:36.304 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:36.562 pt2 00:12:36.562 pt3' 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:36.562 "name": "pt1", 00:12:36.562 "aliases": [ 00:12:36.562 "00000000-0000-0000-0000-000000000001" 00:12:36.562 ], 00:12:36.562 "product_name": "passthru", 00:12:36.562 "block_size": 512, 00:12:36.562 "num_blocks": 65536, 00:12:36.562 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:36.562 "assigned_rate_limits": { 00:12:36.562 "rw_ios_per_sec": 0, 00:12:36.562 "rw_mbytes_per_sec": 0, 00:12:36.562 "r_mbytes_per_sec": 0, 00:12:36.562 "w_mbytes_per_sec": 0 00:12:36.562 }, 00:12:36.562 "claimed": true, 00:12:36.562 "claim_type": "exclusive_write", 00:12:36.562 "zoned": false, 00:12:36.562 "supported_io_types": { 00:12:36.562 "read": true, 00:12:36.562 "write": true, 00:12:36.562 "unmap": true, 00:12:36.562 "flush": true, 00:12:36.562 "reset": true, 00:12:36.562 "nvme_admin": false, 00:12:36.562 "nvme_io": false, 00:12:36.562 "nvme_io_md": false, 00:12:36.562 "write_zeroes": true, 00:12:36.562 "zcopy": true, 00:12:36.562 "get_zone_info": false, 00:12:36.562 "zone_management": false, 00:12:36.562 "zone_append": false, 00:12:36.562 "compare": false, 00:12:36.562 "compare_and_write": false, 00:12:36.562 "abort": true, 00:12:36.562 "seek_hole": false, 00:12:36.562 
"seek_data": false, 00:12:36.562 "copy": true, 00:12:36.562 "nvme_iov_md": false 00:12:36.562 }, 00:12:36.562 "memory_domains": [ 00:12:36.562 { 00:12:36.562 "dma_device_id": "system", 00:12:36.562 "dma_device_type": 1 00:12:36.562 }, 00:12:36.562 { 00:12:36.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.562 "dma_device_type": 2 00:12:36.562 } 00:12:36.562 ], 00:12:36.562 "driver_specific": { 00:12:36.562 "passthru": { 00:12:36.562 "name": "pt1", 00:12:36.562 "base_bdev_name": "malloc1" 00:12:36.562 } 00:12:36.562 } 00:12:36.562 }' 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:36.562 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:36.820 23:34:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:37.078 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.078 "name": "pt2", 00:12:37.078 "aliases": [ 00:12:37.078 "00000000-0000-0000-0000-000000000002" 00:12:37.078 ], 00:12:37.078 "product_name": "passthru", 00:12:37.078 "block_size": 512, 00:12:37.078 "num_blocks": 65536, 00:12:37.078 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:37.078 "assigned_rate_limits": { 00:12:37.078 "rw_ios_per_sec": 0, 00:12:37.079 "rw_mbytes_per_sec": 0, 00:12:37.079 "r_mbytes_per_sec": 0, 00:12:37.079 "w_mbytes_per_sec": 0 00:12:37.079 }, 00:12:37.079 "claimed": true, 00:12:37.079 "claim_type": "exclusive_write", 00:12:37.079 "zoned": false, 00:12:37.079 "supported_io_types": { 00:12:37.079 "read": true, 00:12:37.079 "write": true, 00:12:37.079 "unmap": true, 00:12:37.079 "flush": true, 00:12:37.079 "reset": true, 00:12:37.079 "nvme_admin": false, 00:12:37.079 "nvme_io": false, 00:12:37.079 "nvme_io_md": false, 00:12:37.079 "write_zeroes": true, 00:12:37.079 "zcopy": true, 00:12:37.079 "get_zone_info": false, 00:12:37.079 "zone_management": false, 00:12:37.079 "zone_append": false, 00:12:37.079 "compare": false, 00:12:37.079 "compare_and_write": false, 00:12:37.079 "abort": true, 00:12:37.079 "seek_hole": false, 00:12:37.079 "seek_data": false, 00:12:37.079 "copy": true, 00:12:37.079 "nvme_iov_md": false 00:12:37.079 }, 00:12:37.079 "memory_domains": [ 00:12:37.079 { 00:12:37.079 "dma_device_id": "system", 00:12:37.079 "dma_device_type": 1 00:12:37.079 }, 00:12:37.079 { 00:12:37.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.079 "dma_device_type": 2 00:12:37.079 } 00:12:37.079 ], 00:12:37.079 "driver_specific": { 00:12:37.079 "passthru": { 00:12:37.079 "name": "pt2", 00:12:37.079 "base_bdev_name": "malloc2" 00:12:37.079 } 00:12:37.079 } 00:12:37.079 }' 00:12:37.079 23:34:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.079 23:34:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.079 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.079 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.079 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.337 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.595 "name": "pt3", 00:12:37.595 "aliases": [ 00:12:37.595 "00000000-0000-0000-0000-000000000003" 00:12:37.595 ], 00:12:37.595 "product_name": "passthru", 00:12:37.595 "block_size": 512, 00:12:37.595 "num_blocks": 65536, 00:12:37.595 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:37.595 "assigned_rate_limits": { 
00:12:37.595 "rw_ios_per_sec": 0, 00:12:37.595 "rw_mbytes_per_sec": 0, 00:12:37.595 "r_mbytes_per_sec": 0, 00:12:37.595 "w_mbytes_per_sec": 0 00:12:37.595 }, 00:12:37.595 "claimed": true, 00:12:37.595 "claim_type": "exclusive_write", 00:12:37.595 "zoned": false, 00:12:37.595 "supported_io_types": { 00:12:37.595 "read": true, 00:12:37.595 "write": true, 00:12:37.595 "unmap": true, 00:12:37.595 "flush": true, 00:12:37.595 "reset": true, 00:12:37.595 "nvme_admin": false, 00:12:37.595 "nvme_io": false, 00:12:37.595 "nvme_io_md": false, 00:12:37.595 "write_zeroes": true, 00:12:37.595 "zcopy": true, 00:12:37.595 "get_zone_info": false, 00:12:37.595 "zone_management": false, 00:12:37.595 "zone_append": false, 00:12:37.595 "compare": false, 00:12:37.595 "compare_and_write": false, 00:12:37.595 "abort": true, 00:12:37.595 "seek_hole": false, 00:12:37.595 "seek_data": false, 00:12:37.595 "copy": true, 00:12:37.595 "nvme_iov_md": false 00:12:37.595 }, 00:12:37.595 "memory_domains": [ 00:12:37.595 { 00:12:37.595 "dma_device_id": "system", 00:12:37.595 "dma_device_type": 1 00:12:37.595 }, 00:12:37.595 { 00:12:37.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.595 "dma_device_type": 2 00:12:37.595 } 00:12:37.595 ], 00:12:37.595 "driver_specific": { 00:12:37.595 "passthru": { 00:12:37.595 "name": "pt3", 00:12:37.595 "base_bdev_name": "malloc3" 00:12:37.595 } 00:12:37.595 } 00:12:37.595 }' 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.595 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:37.854 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:38.112 [2024-07-24 23:34:22.859167] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:38.112 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2 00:12:38.112 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2 ']' 00:12:38.112 23:34:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:38.112 [2024-07-24 23:34:23.031426] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:38.112 [2024-07-24 23:34:23.031439] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:38.112 [2024-07-24 23:34:23.031480] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:38.112 [2024-07-24 23:34:23.031516] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:12:38.112 [2024-07-24 23:34:23.031522] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b2d20 name raid_bdev1, state offline 00:12:38.112 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.112 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:38.370 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:38.370 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:38.370 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:38.370 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:38.628 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:38.628 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:38.628 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:38.628 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:38.886 23:34:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:39.144 [2024-07-24 23:34:24.034090] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:39.144 [2024-07-24 23:34:24.035065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:39.144 [2024-07-24 23:34:24.035095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:39.144 [2024-07-24 23:34:24.035128] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:39.144 [2024-07-24 23:34:24.035155] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:39.144 [2024-07-24 23:34:24.035172] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:39.144 [2024-07-24 23:34:24.035181] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:39.144 [2024-07-24 23:34:24.035187] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f1250 name raid_bdev1, state configuring 00:12:39.144 request: 00:12:39.144 { 00:12:39.144 "name": "raid_bdev1", 00:12:39.144 "raid_level": "concat", 00:12:39.144 "base_bdevs": [ 00:12:39.144 "malloc1", 00:12:39.144 "malloc2", 00:12:39.144 "malloc3" 00:12:39.144 ], 00:12:39.144 "strip_size_kb": 64, 00:12:39.144 "superblock": false, 00:12:39.144 "method": "bdev_raid_create", 00:12:39.144 "req_id": 1 00:12:39.144 } 00:12:39.144 Got JSON-RPC error response 00:12:39.144 response: 00:12:39.144 { 00:12:39.144 "code": -17, 00:12:39.144 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:39.144 } 00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 
00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:39.144 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:39.403 [2024-07-24 23:34:24.374934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:39.403 [2024-07-24 23:34:24.374961] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:39.403 [2024-07-24 23:34:24.374970] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f1000 00:12:39.403 [2024-07-24 23:34:24.374976] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:39.403 [2024-07-24 23:34:24.376117] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:39.403 [2024-07-24 23:34:24.376139] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:39.403 [2024-07-24 23:34:24.376184] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:39.403 [2024-07-24 23:34:24.376204] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:39.403 pt1 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.403 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:39.662 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.662 "name": "raid_bdev1", 00:12:39.662 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:39.662 "strip_size_kb": 64, 00:12:39.662 "state": "configuring", 00:12:39.662 "raid_level": "concat", 00:12:39.662 "superblock": true, 00:12:39.662 "num_base_bdevs": 3, 00:12:39.662 "num_base_bdevs_discovered": 1, 00:12:39.662 "num_base_bdevs_operational": 3, 00:12:39.662 "base_bdevs_list": [ 00:12:39.662 { 00:12:39.662 "name": "pt1", 00:12:39.662 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:39.662 
"is_configured": true, 00:12:39.662 "data_offset": 2048, 00:12:39.662 "data_size": 63488 00:12:39.662 }, 00:12:39.662 { 00:12:39.662 "name": null, 00:12:39.662 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:39.662 "is_configured": false, 00:12:39.662 "data_offset": 2048, 00:12:39.662 "data_size": 63488 00:12:39.662 }, 00:12:39.662 { 00:12:39.662 "name": null, 00:12:39.662 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:39.662 "is_configured": false, 00:12:39.662 "data_offset": 2048, 00:12:39.662 "data_size": 63488 00:12:39.662 } 00:12:39.662 ] 00:12:39.662 }' 00:12:39.662 23:34:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.662 23:34:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.225 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:12:40.225 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:40.225 [2024-07-24 23:34:25.189028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:40.225 [2024-07-24 23:34:25.189062] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.225 [2024-07-24 23:34:25.189073] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b2970 00:12:40.225 [2024-07-24 23:34:25.189079] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.225 [2024-07-24 23:34:25.189308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.225 [2024-07-24 23:34:25.189319] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:40.225 [2024-07-24 23:34:25.189359] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:40.225 [2024-07-24 
23:34:25.189373] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:40.225 pt2 00:12:40.225 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:40.482 [2024-07-24 23:34:25.357472] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.482 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.738 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.738 "name": "raid_bdev1", 00:12:40.738 
"uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:40.738 "strip_size_kb": 64, 00:12:40.738 "state": "configuring", 00:12:40.738 "raid_level": "concat", 00:12:40.738 "superblock": true, 00:12:40.738 "num_base_bdevs": 3, 00:12:40.738 "num_base_bdevs_discovered": 1, 00:12:40.738 "num_base_bdevs_operational": 3, 00:12:40.738 "base_bdevs_list": [ 00:12:40.738 { 00:12:40.738 "name": "pt1", 00:12:40.738 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:40.738 "is_configured": true, 00:12:40.738 "data_offset": 2048, 00:12:40.738 "data_size": 63488 00:12:40.738 }, 00:12:40.738 { 00:12:40.738 "name": null, 00:12:40.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:40.738 "is_configured": false, 00:12:40.738 "data_offset": 2048, 00:12:40.738 "data_size": 63488 00:12:40.738 }, 00:12:40.738 { 00:12:40.738 "name": null, 00:12:40.738 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:40.738 "is_configured": false, 00:12:40.738 "data_offset": 2048, 00:12:40.738 "data_size": 63488 00:12:40.738 } 00:12:40.738 ] 00:12:40.738 }' 00:12:40.738 23:34:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.738 23:34:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:41.303 [2024-07-24 23:34:26.203664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:41.303 [2024-07-24 23:34:26.203702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.303 [2024-07-24 23:34:26.203715] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e9ca0 00:12:41.303 [2024-07-24 23:34:26.203721] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.303 [2024-07-24 23:34:26.203959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.303 [2024-07-24 23:34:26.203970] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:41.303 [2024-07-24 23:34:26.204011] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:41.303 [2024-07-24 23:34:26.204024] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:41.303 pt2 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:41.303 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:41.562 [2024-07-24 23:34:26.356054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:41.562 [2024-07-24 23:34:26.356075] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.562 [2024-07-24 23:34:26.356084] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e7ae0 00:12:41.562 [2024-07-24 23:34:26.356090] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.562 [2024-07-24 23:34:26.356296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.562 [2024-07-24 23:34:26.356306] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:41.562 [2024-07-24 23:34:26.356339] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:41.562 
[2024-07-24 23:34:26.356350] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:41.562 [2024-07-24 23:34:26.356420] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24f2cd0 00:12:41.562 [2024-07-24 23:34:26.356426] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:41.562 [2024-07-24 23:34:26.356542] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e9930 00:12:41.562 [2024-07-24 23:34:26.356626] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24f2cd0 00:12:41.562 [2024-07-24 23:34:26.356631] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24f2cd0 00:12:41.562 [2024-07-24 23:34:26.356704] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:41.562 pt3 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.562 
23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.562 "name": "raid_bdev1", 00:12:41.562 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:41.562 "strip_size_kb": 64, 00:12:41.562 "state": "online", 00:12:41.562 "raid_level": "concat", 00:12:41.562 "superblock": true, 00:12:41.562 "num_base_bdevs": 3, 00:12:41.562 "num_base_bdevs_discovered": 3, 00:12:41.562 "num_base_bdevs_operational": 3, 00:12:41.562 "base_bdevs_list": [ 00:12:41.562 { 00:12:41.562 "name": "pt1", 00:12:41.562 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.562 "is_configured": true, 00:12:41.562 "data_offset": 2048, 00:12:41.562 "data_size": 63488 00:12:41.562 }, 00:12:41.562 { 00:12:41.562 "name": "pt2", 00:12:41.562 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.562 "is_configured": true, 00:12:41.562 "data_offset": 2048, 00:12:41.562 "data_size": 63488 00:12:41.562 }, 00:12:41.562 { 00:12:41.562 "name": "pt3", 00:12:41.562 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:41.562 "is_configured": true, 00:12:41.562 "data_offset": 2048, 00:12:41.562 "data_size": 63488 00:12:41.562 } 00:12:41.562 ] 00:12:41.562 }' 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.562 23:34:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:42.125 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:42.383 [2024-07-24 23:34:27.170370] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:42.384 "name": "raid_bdev1", 00:12:42.384 "aliases": [ 00:12:42.384 "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2" 00:12:42.384 ], 00:12:42.384 "product_name": "Raid Volume", 00:12:42.384 "block_size": 512, 00:12:42.384 "num_blocks": 190464, 00:12:42.384 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:42.384 "assigned_rate_limits": { 00:12:42.384 "rw_ios_per_sec": 0, 00:12:42.384 "rw_mbytes_per_sec": 0, 00:12:42.384 "r_mbytes_per_sec": 0, 00:12:42.384 "w_mbytes_per_sec": 0 00:12:42.384 }, 00:12:42.384 "claimed": false, 00:12:42.384 "zoned": false, 00:12:42.384 "supported_io_types": { 00:12:42.384 "read": true, 00:12:42.384 "write": true, 00:12:42.384 "unmap": true, 00:12:42.384 "flush": true, 00:12:42.384 "reset": true, 00:12:42.384 "nvme_admin": false, 00:12:42.384 "nvme_io": false, 00:12:42.384 "nvme_io_md": false, 00:12:42.384 "write_zeroes": true, 00:12:42.384 "zcopy": false, 00:12:42.384 
"get_zone_info": false, 00:12:42.384 "zone_management": false, 00:12:42.384 "zone_append": false, 00:12:42.384 "compare": false, 00:12:42.384 "compare_and_write": false, 00:12:42.384 "abort": false, 00:12:42.384 "seek_hole": false, 00:12:42.384 "seek_data": false, 00:12:42.384 "copy": false, 00:12:42.384 "nvme_iov_md": false 00:12:42.384 }, 00:12:42.384 "memory_domains": [ 00:12:42.384 { 00:12:42.384 "dma_device_id": "system", 00:12:42.384 "dma_device_type": 1 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.384 "dma_device_type": 2 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "dma_device_id": "system", 00:12:42.384 "dma_device_type": 1 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.384 "dma_device_type": 2 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "dma_device_id": "system", 00:12:42.384 "dma_device_type": 1 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.384 "dma_device_type": 2 00:12:42.384 } 00:12:42.384 ], 00:12:42.384 "driver_specific": { 00:12:42.384 "raid": { 00:12:42.384 "uuid": "795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2", 00:12:42.384 "strip_size_kb": 64, 00:12:42.384 "state": "online", 00:12:42.384 "raid_level": "concat", 00:12:42.384 "superblock": true, 00:12:42.384 "num_base_bdevs": 3, 00:12:42.384 "num_base_bdevs_discovered": 3, 00:12:42.384 "num_base_bdevs_operational": 3, 00:12:42.384 "base_bdevs_list": [ 00:12:42.384 { 00:12:42.384 "name": "pt1", 00:12:42.384 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.384 "is_configured": true, 00:12:42.384 "data_offset": 2048, 00:12:42.384 "data_size": 63488 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "name": "pt2", 00:12:42.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.384 "is_configured": true, 00:12:42.384 "data_offset": 2048, 00:12:42.384 "data_size": 63488 00:12:42.384 }, 00:12:42.384 { 00:12:42.384 "name": "pt3", 00:12:42.384 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:12:42.384 "is_configured": true, 00:12:42.384 "data_offset": 2048, 00:12:42.384 "data_size": 63488 00:12:42.384 } 00:12:42.384 ] 00:12:42.384 } 00:12:42.384 } 00:12:42.384 }' 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:42.384 pt2 00:12:42.384 pt3' 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:42.384 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.642 "name": "pt1", 00:12:42.642 "aliases": [ 00:12:42.642 "00000000-0000-0000-0000-000000000001" 00:12:42.642 ], 00:12:42.642 "product_name": "passthru", 00:12:42.642 "block_size": 512, 00:12:42.642 "num_blocks": 65536, 00:12:42.642 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.642 "assigned_rate_limits": { 00:12:42.642 "rw_ios_per_sec": 0, 00:12:42.642 "rw_mbytes_per_sec": 0, 00:12:42.642 "r_mbytes_per_sec": 0, 00:12:42.642 "w_mbytes_per_sec": 0 00:12:42.642 }, 00:12:42.642 "claimed": true, 00:12:42.642 "claim_type": "exclusive_write", 00:12:42.642 "zoned": false, 00:12:42.642 "supported_io_types": { 00:12:42.642 "read": true, 00:12:42.642 "write": true, 00:12:42.642 "unmap": true, 00:12:42.642 "flush": true, 00:12:42.642 "reset": true, 00:12:42.642 "nvme_admin": false, 00:12:42.642 "nvme_io": false, 00:12:42.642 "nvme_io_md": false, 00:12:42.642 "write_zeroes": true, 00:12:42.642 "zcopy": true, 00:12:42.642 "get_zone_info": false, 
00:12:42.642 "zone_management": false, 00:12:42.642 "zone_append": false, 00:12:42.642 "compare": false, 00:12:42.642 "compare_and_write": false, 00:12:42.642 "abort": true, 00:12:42.642 "seek_hole": false, 00:12:42.642 "seek_data": false, 00:12:42.642 "copy": true, 00:12:42.642 "nvme_iov_md": false 00:12:42.642 }, 00:12:42.642 "memory_domains": [ 00:12:42.642 { 00:12:42.642 "dma_device_id": "system", 00:12:42.642 "dma_device_type": 1 00:12:42.642 }, 00:12:42.642 { 00:12:42.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.642 "dma_device_type": 2 00:12:42.642 } 00:12:42.642 ], 00:12:42.642 "driver_specific": { 00:12:42.642 "passthru": { 00:12:42.642 "name": "pt1", 00:12:42.642 "base_bdev_name": "malloc1" 00:12:42.642 } 00:12:42.642 } 00:12:42.642 }' 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.642 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.900 23:34:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.900 "name": "pt2", 00:12:42.900 "aliases": [ 00:12:42.900 "00000000-0000-0000-0000-000000000002" 00:12:42.900 ], 00:12:42.900 "product_name": "passthru", 00:12:42.900 "block_size": 512, 00:12:42.900 "num_blocks": 65536, 00:12:42.900 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.900 "assigned_rate_limits": { 00:12:42.900 "rw_ios_per_sec": 0, 00:12:42.900 "rw_mbytes_per_sec": 0, 00:12:42.900 "r_mbytes_per_sec": 0, 00:12:42.900 "w_mbytes_per_sec": 0 00:12:42.900 }, 00:12:42.900 "claimed": true, 00:12:42.900 "claim_type": "exclusive_write", 00:12:42.900 "zoned": false, 00:12:42.900 "supported_io_types": { 00:12:42.900 "read": true, 00:12:42.900 "write": true, 00:12:42.900 "unmap": true, 00:12:42.900 "flush": true, 00:12:42.900 "reset": true, 00:12:42.900 "nvme_admin": false, 00:12:42.900 "nvme_io": false, 00:12:42.900 "nvme_io_md": false, 00:12:42.900 "write_zeroes": true, 00:12:42.900 "zcopy": true, 00:12:42.900 "get_zone_info": false, 00:12:42.900 "zone_management": false, 00:12:42.900 "zone_append": false, 00:12:42.900 "compare": false, 00:12:42.900 "compare_and_write": false, 00:12:42.900 "abort": true, 00:12:42.900 "seek_hole": false, 00:12:42.900 "seek_data": false, 00:12:42.900 "copy": true, 00:12:42.900 "nvme_iov_md": false 00:12:42.900 }, 00:12:42.900 "memory_domains": [ 00:12:42.900 { 00:12:42.900 "dma_device_id": "system", 00:12:42.900 "dma_device_type": 1 00:12:42.900 }, 00:12:42.900 { 00:12:42.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.900 
"dma_device_type": 2 00:12:42.900 } 00:12:42.900 ], 00:12:42.900 "driver_specific": { 00:12:42.900 "passthru": { 00:12:42.900 "name": "pt2", 00:12:42.900 "base_bdev_name": "malloc2" 00:12:42.900 } 00:12:42.900 } 00:12:42.900 }' 00:12:42.900 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.158 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.158 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.158 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.158 23:34:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.158 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.416 "name": "pt3", 00:12:43.416 "aliases": [ 00:12:43.416 
"00000000-0000-0000-0000-000000000003" 00:12:43.416 ], 00:12:43.416 "product_name": "passthru", 00:12:43.416 "block_size": 512, 00:12:43.416 "num_blocks": 65536, 00:12:43.416 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:43.416 "assigned_rate_limits": { 00:12:43.416 "rw_ios_per_sec": 0, 00:12:43.416 "rw_mbytes_per_sec": 0, 00:12:43.416 "r_mbytes_per_sec": 0, 00:12:43.416 "w_mbytes_per_sec": 0 00:12:43.416 }, 00:12:43.416 "claimed": true, 00:12:43.416 "claim_type": "exclusive_write", 00:12:43.416 "zoned": false, 00:12:43.416 "supported_io_types": { 00:12:43.416 "read": true, 00:12:43.416 "write": true, 00:12:43.416 "unmap": true, 00:12:43.416 "flush": true, 00:12:43.416 "reset": true, 00:12:43.416 "nvme_admin": false, 00:12:43.416 "nvme_io": false, 00:12:43.416 "nvme_io_md": false, 00:12:43.416 "write_zeroes": true, 00:12:43.416 "zcopy": true, 00:12:43.416 "get_zone_info": false, 00:12:43.416 "zone_management": false, 00:12:43.416 "zone_append": false, 00:12:43.416 "compare": false, 00:12:43.416 "compare_and_write": false, 00:12:43.416 "abort": true, 00:12:43.416 "seek_hole": false, 00:12:43.416 "seek_data": false, 00:12:43.416 "copy": true, 00:12:43.416 "nvme_iov_md": false 00:12:43.416 }, 00:12:43.416 "memory_domains": [ 00:12:43.416 { 00:12:43.416 "dma_device_id": "system", 00:12:43.416 "dma_device_type": 1 00:12:43.416 }, 00:12:43.416 { 00:12:43.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.416 "dma_device_type": 2 00:12:43.416 } 00:12:43.416 ], 00:12:43.416 "driver_specific": { 00:12:43.416 "passthru": { 00:12:43.416 "name": "pt3", 00:12:43.416 "base_bdev_name": "malloc3" 00:12:43.416 } 00:12:43.416 } 00:12:43.416 }' 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.416 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.674 23:34:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:43.674 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.933 [2024-07-24 23:34:28.806616] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2 '!=' 795f64f8-e3c9-484b-8a3b-c3cd7adbc8f2 ']' 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 284464 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 284464 ']' 00:12:43.933 23:34:28 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 284464 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 284464 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 284464' 00:12:43.933 killing process with pid 284464 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 284464 00:12:43.933 [2024-07-24 23:34:28.861054] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:43.933 [2024-07-24 23:34:28.861093] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:43.933 23:34:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 284464 00:12:43.933 [2024-07-24 23:34:28.861130] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:43.933 [2024-07-24 23:34:28.861140] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f2cd0 name raid_bdev1, state offline 00:12:43.933 [2024-07-24 23:34:28.884009] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:44.191 23:34:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:44.191 00:12:44.191 real 0m10.623s 00:12:44.192 user 0m19.401s 00:12:44.192 sys 0m1.607s 00:12:44.192 23:34:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:44.192 23:34:29 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:44.192 ************************************ 00:12:44.192 END TEST raid_superblock_test 00:12:44.192 ************************************ 00:12:44.192 23:34:29 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:12:44.192 23:34:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:44.192 23:34:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:44.192 23:34:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:44.192 ************************************ 00:12:44.192 START TEST raid_read_error_test 00:12:44.192 ************************************ 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 read 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:44.192 23:34:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.lN35B67KI7 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=286571 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 286571 /var/tmp/spdk-raid.sock 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 286571 ']' 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:44.192 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:44.192 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.192 [2024-07-24 23:34:29.178629] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:12:44.192 [2024-07-24 23:34:29.178667] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid286571 ] 00:12:44.451 [2024-07-24 23:34:29.240693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:44.451 [2024-07-24 23:34:29.319470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.451 [2024-07-24 23:34:29.374327] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:44.451 [2024-07-24 23:34:29.374354] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.017 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:45.017 23:34:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:45.017 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:45.017 23:34:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:45.275 BaseBdev1_malloc 00:12:45.275 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:45.533 true 00:12:45.533 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:45.533 [2024-07-24 23:34:30.462558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:45.533 [2024-07-24 23:34:30.462591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:45.533 [2024-07-24 23:34:30.462602] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13df550 00:12:45.533 [2024-07-24 23:34:30.462609] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:45.533 [2024-07-24 23:34:30.463829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:45.533 [2024-07-24 23:34:30.463850] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:45.533 BaseBdev1 00:12:45.533 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:45.533 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:45.791 BaseBdev2_malloc 00:12:45.791 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:45.791 true 00:12:46.049 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:46.049 [2024-07-24 23:34:30.943175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:46.049 [2024-07-24 23:34:30.943204] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.049 [2024-07-24 23:34:30.943215] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e3d90 00:12:46.049 [2024-07-24 23:34:30.943224] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.049 [2024-07-24 23:34:30.944237] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:46.049 [2024-07-24 23:34:30.944258] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:46.049 BaseBdev2 00:12:46.049 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:46.049 23:34:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:46.307 BaseBdev3_malloc 00:12:46.307 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:46.307 true 00:12:46.307 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:46.565 [2024-07-24 23:34:31.435969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:46.565 [2024-07-24 23:34:31.436001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.565 [2024-07-24 23:34:31.436012] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e6050 00:12:46.565 [2024-07-24 23:34:31.436018] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.565 [2024-07-24 23:34:31.437033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:46.565 [2024-07-24 23:34:31.437053] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:46.565 BaseBdev3 00:12:46.565 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:46.823 [2024-07-24 23:34:31.592398] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:46.823 [2024-07-24 23:34:31.593222] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:46.823 [2024-07-24 23:34:31.593267] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:46.823 [2024-07-24 23:34:31.593404] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e7700 00:12:46.823 [2024-07-24 23:34:31.593411] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:46.823 [2024-07-24 23:34:31.593545] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e72a0 00:12:46.823 [2024-07-24 23:34:31.593644] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e7700 00:12:46.823 [2024-07-24 23:34:31.593649] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13e7700 00:12:46.823 [2024-07-24 23:34:31.593714] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.823 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.823 "name": "raid_bdev1", 00:12:46.823 "uuid": "972907bd-1e52-4d75-8e2e-1abbd71fd898", 00:12:46.823 "strip_size_kb": 64, 00:12:46.823 "state": "online", 00:12:46.823 "raid_level": "concat", 00:12:46.823 "superblock": true, 00:12:46.823 "num_base_bdevs": 3, 00:12:46.823 "num_base_bdevs_discovered": 3, 00:12:46.823 "num_base_bdevs_operational": 3, 00:12:46.823 "base_bdevs_list": [ 00:12:46.823 { 00:12:46.823 "name": "BaseBdev1", 00:12:46.823 "uuid": "10f3b0be-e57b-52da-8e06-cf1794f0db7b", 00:12:46.824 "is_configured": true, 00:12:46.824 "data_offset": 2048, 00:12:46.824 "data_size": 63488 00:12:46.824 }, 00:12:46.824 { 00:12:46.824 "name": "BaseBdev2", 00:12:46.824 "uuid": "a480f518-589f-577d-acba-ced3cbf80a6c", 00:12:46.824 "is_configured": true, 00:12:46.824 "data_offset": 2048, 00:12:46.824 "data_size": 63488 00:12:46.824 }, 00:12:46.824 { 00:12:46.824 "name": "BaseBdev3", 00:12:46.824 "uuid": "2ee6025b-8e91-5837-9a77-56ddef6b2d6b", 00:12:46.824 "is_configured": true, 00:12:46.824 "data_offset": 2048, 00:12:46.824 "data_size": 63488 00:12:46.824 } 00:12:46.824 ] 00:12:46.824 }' 00:12:46.824 23:34:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.824 23:34:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.390 23:34:32 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:12:47.390 23:34:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:47.390 [2024-07-24 23:34:32.342537] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1235950 00:12:48.325 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.582 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.839 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.839 "name": "raid_bdev1", 00:12:48.839 "uuid": "972907bd-1e52-4d75-8e2e-1abbd71fd898", 00:12:48.839 "strip_size_kb": 64, 00:12:48.840 "state": "online", 00:12:48.840 "raid_level": "concat", 00:12:48.840 "superblock": true, 00:12:48.840 "num_base_bdevs": 3, 00:12:48.840 "num_base_bdevs_discovered": 3, 00:12:48.840 "num_base_bdevs_operational": 3, 00:12:48.840 "base_bdevs_list": [ 00:12:48.840 { 00:12:48.840 "name": "BaseBdev1", 00:12:48.840 "uuid": "10f3b0be-e57b-52da-8e06-cf1794f0db7b", 00:12:48.840 "is_configured": true, 00:12:48.840 "data_offset": 2048, 00:12:48.840 "data_size": 63488 00:12:48.840 }, 00:12:48.840 { 00:12:48.840 "name": "BaseBdev2", 00:12:48.840 "uuid": "a480f518-589f-577d-acba-ced3cbf80a6c", 00:12:48.840 "is_configured": true, 00:12:48.840 "data_offset": 2048, 00:12:48.840 "data_size": 63488 00:12:48.840 }, 00:12:48.840 { 00:12:48.840 "name": "BaseBdev3", 00:12:48.840 "uuid": "2ee6025b-8e91-5837-9a77-56ddef6b2d6b", 00:12:48.840 "is_configured": true, 00:12:48.840 "data_offset": 2048, 00:12:48.840 "data_size": 63488 00:12:48.840 } 00:12:48.840 ] 00:12:48.840 }' 00:12:48.840 23:34:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.840 23:34:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:49.404 [2024-07-24 
23:34:34.262344] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:49.404 [2024-07-24 23:34:34.262381] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:49.404 [2024-07-24 23:34:34.264348] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.404 [2024-07-24 23:34:34.264373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:49.404 [2024-07-24 23:34:34.264395] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:49.404 [2024-07-24 23:34:34.264400] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e7700 name raid_bdev1, state offline 00:12:49.404 0 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 286571 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 286571 ']' 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 286571 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 286571 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 286571' 00:12:49.404 killing process with pid 286571 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 286571 00:12:49.404 [2024-07-24 23:34:34.324507] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:49.404 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 286571 00:12:49.404 [2024-07-24 23:34:34.342644] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.lN35B67KI7 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:49.662 00:12:49.662 real 0m5.408s 00:12:49.662 user 0m8.414s 00:12:49.662 sys 0m0.752s 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:49.662 23:34:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.662 ************************************ 00:12:49.662 END TEST raid_read_error_test 00:12:49.662 ************************************ 00:12:49.662 23:34:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:12:49.662 23:34:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:49.662 23:34:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:49.662 23:34:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:49.662 ************************************ 00:12:49.662 START TEST raid_write_error_test 
00:12:49.662 ************************************ 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 write 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:49.662 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:49.663 23:34:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.YFyB9ccokv 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=287582 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 287582 /var/tmp/spdk-raid.sock 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 287582 ']' 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:12:49.663 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:49.663 23:34:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.663 [2024-07-24 23:34:34.657835] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:12:49.663 [2024-07-24 23:34:34.657871] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid287582 ] 00:12:49.921 [2024-07-24 23:34:34.720935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:49.921 [2024-07-24 23:34:34.796253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:49.921 [2024-07-24 23:34:34.847987] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.921 [2024-07-24 23:34:34.848012] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.485 23:34:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:50.485 23:34:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:50.485 23:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:50.485 23:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:50.743 BaseBdev1_malloc 00:12:50.743 23:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:51.000 true 00:12:51.000 23:34:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:51.000 [2024-07-24 23:34:35.947354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:51.000 [2024-07-24 23:34:35.947386] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.000 [2024-07-24 23:34:35.947396] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23bc550 00:12:51.000 [2024-07-24 23:34:35.947402] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.000 [2024-07-24 23:34:35.948514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.000 [2024-07-24 23:34:35.948537] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:51.000 BaseBdev1 00:12:51.000 23:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:51.000 23:34:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:51.259 BaseBdev2_malloc 00:12:51.259 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:51.544 true 00:12:51.544 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:51.544 [2024-07-24 23:34:36.451965] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:51.544 [2024-07-24 23:34:36.451998] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.544 [2024-07-24 23:34:36.452008] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c0d90 00:12:51.544 [2024-07-24 23:34:36.452014] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.544 [2024-07-24 23:34:36.452987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.544 [2024-07-24 23:34:36.453008] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:51.544 BaseBdev2 00:12:51.544 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:51.544 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:12:51.813 BaseBdev3_malloc 00:12:51.813 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:12:51.813 true 00:12:52.071 23:34:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:12:52.071 [2024-07-24 23:34:36.985036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:12:52.071 [2024-07-24 23:34:36.985067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.071 [2024-07-24 23:34:36.985078] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c3050 00:12:52.071 [2024-07-24 23:34:36.985083] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.071 [2024-07-24 23:34:36.986052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:12:52.071 [2024-07-24 23:34:36.986072] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:12:52.071 BaseBdev3 00:12:52.071 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:12:52.328 [2024-07-24 23:34:37.153500] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.328 [2024-07-24 23:34:37.154276] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:52.328 [2024-07-24 23:34:37.154321] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:52.328 [2024-07-24 23:34:37.154457] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23c4700 00:12:52.328 [2024-07-24 23:34:37.154464] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:52.328 [2024-07-24 23:34:37.154596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c42a0 00:12:52.328 [2024-07-24 23:34:37.154690] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23c4700 00:12:52.328 [2024-07-24 23:34:37.154694] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23c4700 00:12:52.328 [2024-07-24 23:34:37.154755] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=concat 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.328 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.585 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.585 "name": "raid_bdev1", 00:12:52.585 "uuid": "d5dea3c8-28d4-49d1-a359-69f8f144f5fe", 00:12:52.585 "strip_size_kb": 64, 00:12:52.585 "state": "online", 00:12:52.585 "raid_level": "concat", 00:12:52.585 "superblock": true, 00:12:52.585 "num_base_bdevs": 3, 00:12:52.585 "num_base_bdevs_discovered": 3, 00:12:52.585 "num_base_bdevs_operational": 3, 00:12:52.585 "base_bdevs_list": [ 00:12:52.585 { 00:12:52.585 "name": "BaseBdev1", 00:12:52.585 "uuid": "0f9eedae-0a39-567a-b9a7-35e3a18d14a4", 00:12:52.585 "is_configured": true, 00:12:52.585 "data_offset": 2048, 00:12:52.585 "data_size": 63488 00:12:52.585 }, 00:12:52.585 { 00:12:52.585 "name": "BaseBdev2", 00:12:52.585 "uuid": "92a49e1e-b3fa-5564-a812-4e73df143988", 00:12:52.585 "is_configured": true, 00:12:52.585 "data_offset": 2048, 00:12:52.585 "data_size": 63488 00:12:52.585 }, 00:12:52.585 { 00:12:52.585 "name": "BaseBdev3", 
00:12:52.585 "uuid": "52889bee-9ec5-5dd5-b53a-46bd6fe1b72a", 00:12:52.585 "is_configured": true, 00:12:52.585 "data_offset": 2048, 00:12:52.585 "data_size": 63488 00:12:52.585 } 00:12:52.585 ] 00:12:52.585 }' 00:12:52.585 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.585 23:34:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.842 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:52.842 23:34:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:53.099 [2024-07-24 23:34:37.867569] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2212950 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.033 23:34:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.033 23:34:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:54.290 23:34:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.290 "name": "raid_bdev1", 00:12:54.290 "uuid": "d5dea3c8-28d4-49d1-a359-69f8f144f5fe", 00:12:54.290 "strip_size_kb": 64, 00:12:54.290 "state": "online", 00:12:54.290 "raid_level": "concat", 00:12:54.290 "superblock": true, 00:12:54.290 "num_base_bdevs": 3, 00:12:54.290 "num_base_bdevs_discovered": 3, 00:12:54.290 "num_base_bdevs_operational": 3, 00:12:54.290 "base_bdevs_list": [ 00:12:54.290 { 00:12:54.290 "name": "BaseBdev1", 00:12:54.290 "uuid": "0f9eedae-0a39-567a-b9a7-35e3a18d14a4", 00:12:54.290 "is_configured": true, 00:12:54.290 "data_offset": 2048, 00:12:54.290 "data_size": 63488 00:12:54.290 }, 00:12:54.290 { 00:12:54.290 "name": "BaseBdev2", 00:12:54.290 "uuid": "92a49e1e-b3fa-5564-a812-4e73df143988", 00:12:54.290 "is_configured": true, 00:12:54.290 "data_offset": 2048, 00:12:54.290 "data_size": 63488 00:12:54.290 }, 00:12:54.290 { 00:12:54.290 "name": "BaseBdev3", 00:12:54.291 "uuid": "52889bee-9ec5-5dd5-b53a-46bd6fe1b72a", 00:12:54.291 "is_configured": true, 00:12:54.291 "data_offset": 2048, 00:12:54.291 "data_size": 
63488 00:12:54.291 } 00:12:54.291 ] 00:12:54.291 }' 00:12:54.291 23:34:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.291 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:54.856 [2024-07-24 23:34:39.784642] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:54.856 [2024-07-24 23:34:39.784672] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.856 [2024-07-24 23:34:39.786755] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.856 [2024-07-24 23:34:39.786788] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.856 [2024-07-24 23:34:39.786810] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.856 [2024-07-24 23:34:39.786815] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c4700 name raid_bdev1, state offline 00:12:54.856 0 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 287582 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 287582 ']' 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 287582 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 287582 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 287582' 00:12:54.856 killing process with pid 287582 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 287582 00:12:54.856 [2024-07-24 23:34:39.849546] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.856 23:34:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 287582 00:12:55.114 [2024-07-24 23:34:39.867850] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.YFyB9ccokv 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:12:55.114 00:12:55.114 real 0m5.465s 00:12:55.114 user 0m8.476s 00:12:55.114 sys 0m0.812s 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:55.114 23:34:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.114 ************************************ 00:12:55.114 END TEST raid_write_error_test 00:12:55.114 ************************************ 
00:12:55.114 23:34:40 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:55.114 23:34:40 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:12:55.114 23:34:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:55.114 23:34:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:55.114 23:34:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:55.372 ************************************ 00:12:55.372 START TEST raid_state_function_test 00:12:55.372 ************************************ 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 false 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # (( i <= num_base_bdevs )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=288592 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 288592' 00:12:55.373 Process raid pid: 288592 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@246 -- # waitforlisten 288592 /var/tmp/spdk-raid.sock 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 288592 ']' 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:55.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:55.373 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.373 [2024-07-24 23:34:40.183159] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:12:55.373 [2024-07-24 23:34:40.183196] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:55.373 [2024-07-24 23:34:40.248290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.373 [2024-07-24 23:34:40.320099] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.373 [2024-07-24 23:34:40.369034] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.373 [2024-07-24 23:34:40.369057] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.307 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:56.307 23:34:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:12:56.307 23:34:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:56.307 [2024-07-24 23:34:41.135937] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:56.307 [2024-07-24 23:34:41.135969] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:56.307 [2024-07-24 23:34:41.135975] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:56.307 [2024-07-24 23:34:41.135981] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:56.307 [2024-07-24 23:34:41.135985] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:56.307 [2024-07-24 23:34:41.135991] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:56.307 23:34:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.307 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:56.565 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.565 "name": "Existed_Raid", 00:12:56.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.565 "strip_size_kb": 0, 00:12:56.565 "state": "configuring", 00:12:56.565 "raid_level": "raid1", 00:12:56.565 "superblock": false, 00:12:56.565 "num_base_bdevs": 3, 00:12:56.565 "num_base_bdevs_discovered": 0, 00:12:56.565 "num_base_bdevs_operational": 3, 00:12:56.565 "base_bdevs_list": [ 00:12:56.565 { 00:12:56.565 
"name": "BaseBdev1", 00:12:56.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.565 "is_configured": false, 00:12:56.565 "data_offset": 0, 00:12:56.565 "data_size": 0 00:12:56.565 }, 00:12:56.565 { 00:12:56.565 "name": "BaseBdev2", 00:12:56.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.565 "is_configured": false, 00:12:56.565 "data_offset": 0, 00:12:56.565 "data_size": 0 00:12:56.565 }, 00:12:56.565 { 00:12:56.565 "name": "BaseBdev3", 00:12:56.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:56.565 "is_configured": false, 00:12:56.565 "data_offset": 0, 00:12:56.565 "data_size": 0 00:12:56.565 } 00:12:56.565 ] 00:12:56.565 }' 00:12:56.565 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.565 23:34:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.824 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:57.081 [2024-07-24 23:34:41.949961] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:57.081 [2024-07-24 23:34:41.949982] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2777b30 name Existed_Raid, state configuring 00:12:57.082 23:34:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:57.340 [2024-07-24 23:34:42.122414] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:57.340 [2024-07-24 23:34:42.122435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:57.340 [2024-07-24 23:34:42.122439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:12:57.340 [2024-07-24 23:34:42.122444] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:57.340 [2024-07-24 23:34:42.122448] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:57.340 [2024-07-24 23:34:42.122452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:57.340 [2024-07-24 23:34:42.311192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.340 BaseBdev1 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:57.340 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.598 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:57.856 [ 00:12:57.856 { 00:12:57.856 "name": "BaseBdev1", 00:12:57.856 "aliases": [ 00:12:57.856 "b84a15a9-a21e-474c-a74d-ab84b6a6258d" 
00:12:57.856 ], 00:12:57.856 "product_name": "Malloc disk", 00:12:57.856 "block_size": 512, 00:12:57.856 "num_blocks": 65536, 00:12:57.856 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:12:57.856 "assigned_rate_limits": { 00:12:57.856 "rw_ios_per_sec": 0, 00:12:57.856 "rw_mbytes_per_sec": 0, 00:12:57.856 "r_mbytes_per_sec": 0, 00:12:57.856 "w_mbytes_per_sec": 0 00:12:57.856 }, 00:12:57.856 "claimed": true, 00:12:57.856 "claim_type": "exclusive_write", 00:12:57.856 "zoned": false, 00:12:57.856 "supported_io_types": { 00:12:57.856 "read": true, 00:12:57.856 "write": true, 00:12:57.856 "unmap": true, 00:12:57.856 "flush": true, 00:12:57.856 "reset": true, 00:12:57.856 "nvme_admin": false, 00:12:57.856 "nvme_io": false, 00:12:57.856 "nvme_io_md": false, 00:12:57.856 "write_zeroes": true, 00:12:57.856 "zcopy": true, 00:12:57.856 "get_zone_info": false, 00:12:57.856 "zone_management": false, 00:12:57.856 "zone_append": false, 00:12:57.856 "compare": false, 00:12:57.856 "compare_and_write": false, 00:12:57.856 "abort": true, 00:12:57.856 "seek_hole": false, 00:12:57.856 "seek_data": false, 00:12:57.856 "copy": true, 00:12:57.856 "nvme_iov_md": false 00:12:57.856 }, 00:12:57.856 "memory_domains": [ 00:12:57.856 { 00:12:57.856 "dma_device_id": "system", 00:12:57.856 "dma_device_type": 1 00:12:57.856 }, 00:12:57.856 { 00:12:57.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.856 "dma_device_type": 2 00:12:57.856 } 00:12:57.856 ], 00:12:57.856 "driver_specific": {} 00:12:57.856 } 00:12:57.856 ] 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.856 "name": "Existed_Raid", 00:12:57.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.856 "strip_size_kb": 0, 00:12:57.856 "state": "configuring", 00:12:57.856 "raid_level": "raid1", 00:12:57.856 "superblock": false, 00:12:57.856 "num_base_bdevs": 3, 00:12:57.856 "num_base_bdevs_discovered": 1, 00:12:57.856 "num_base_bdevs_operational": 3, 00:12:57.856 "base_bdevs_list": [ 00:12:57.856 { 00:12:57.856 "name": "BaseBdev1", 00:12:57.856 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:12:57.856 "is_configured": true, 00:12:57.856 "data_offset": 0, 00:12:57.856 "data_size": 65536 00:12:57.856 }, 00:12:57.856 { 00:12:57.856 "name": "BaseBdev2", 00:12:57.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.856 "is_configured": 
false, 00:12:57.856 "data_offset": 0, 00:12:57.856 "data_size": 0 00:12:57.856 }, 00:12:57.856 { 00:12:57.856 "name": "BaseBdev3", 00:12:57.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.856 "is_configured": false, 00:12:57.856 "data_offset": 0, 00:12:57.856 "data_size": 0 00:12:57.856 } 00:12:57.856 ] 00:12:57.856 }' 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.856 23:34:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.423 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:58.684 [2024-07-24 23:34:43.482197] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:58.684 [2024-07-24 23:34:43.482232] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27773a0 name Existed_Raid, state configuring 00:12:58.684 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:58.684 [2024-07-24 23:34:43.662692] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:58.684 [2024-07-24 23:34:43.663712] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:58.684 [2024-07-24 23:34:43.663740] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:58.684 [2024-07-24 23:34:43.663745] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:58.684 [2024-07-24 23:34:43.663750] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.943 "name": "Existed_Raid", 00:12:58.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.943 "strip_size_kb": 0, 00:12:58.943 "state": "configuring", 00:12:58.943 "raid_level": "raid1", 00:12:58.943 "superblock": false, 00:12:58.943 "num_base_bdevs": 3, 
00:12:58.943 "num_base_bdevs_discovered": 1, 00:12:58.943 "num_base_bdevs_operational": 3, 00:12:58.943 "base_bdevs_list": [ 00:12:58.943 { 00:12:58.943 "name": "BaseBdev1", 00:12:58.943 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:12:58.943 "is_configured": true, 00:12:58.943 "data_offset": 0, 00:12:58.943 "data_size": 65536 00:12:58.943 }, 00:12:58.943 { 00:12:58.943 "name": "BaseBdev2", 00:12:58.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.943 "is_configured": false, 00:12:58.943 "data_offset": 0, 00:12:58.943 "data_size": 0 00:12:58.943 }, 00:12:58.943 { 00:12:58.943 "name": "BaseBdev3", 00:12:58.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.943 "is_configured": false, 00:12:58.943 "data_offset": 0, 00:12:58.943 "data_size": 0 00:12:58.943 } 00:12:58.943 ] 00:12:58.943 }' 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.943 23:34:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.509 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:59.509 [2024-07-24 23:34:44.507621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:59.509 BaseBdev2 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:59.767 23:34:44 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:59.767 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:00.025 [ 00:13:00.025 { 00:13:00.025 "name": "BaseBdev2", 00:13:00.025 "aliases": [ 00:13:00.025 "58df51e4-fcd7-4aa5-9046-4f789385e9d4" 00:13:00.025 ], 00:13:00.025 "product_name": "Malloc disk", 00:13:00.025 "block_size": 512, 00:13:00.025 "num_blocks": 65536, 00:13:00.025 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:00.025 "assigned_rate_limits": { 00:13:00.025 "rw_ios_per_sec": 0, 00:13:00.025 "rw_mbytes_per_sec": 0, 00:13:00.025 "r_mbytes_per_sec": 0, 00:13:00.025 "w_mbytes_per_sec": 0 00:13:00.025 }, 00:13:00.025 "claimed": true, 00:13:00.025 "claim_type": "exclusive_write", 00:13:00.025 "zoned": false, 00:13:00.025 "supported_io_types": { 00:13:00.025 "read": true, 00:13:00.025 "write": true, 00:13:00.025 "unmap": true, 00:13:00.025 "flush": true, 00:13:00.025 "reset": true, 00:13:00.026 "nvme_admin": false, 00:13:00.026 "nvme_io": false, 00:13:00.026 "nvme_io_md": false, 00:13:00.026 "write_zeroes": true, 00:13:00.026 "zcopy": true, 00:13:00.026 "get_zone_info": false, 00:13:00.026 "zone_management": false, 00:13:00.026 "zone_append": false, 00:13:00.026 "compare": false, 00:13:00.026 "compare_and_write": false, 00:13:00.026 "abort": true, 00:13:00.026 "seek_hole": false, 00:13:00.026 "seek_data": false, 00:13:00.026 "copy": true, 00:13:00.026 "nvme_iov_md": false 00:13:00.026 }, 00:13:00.026 "memory_domains": [ 00:13:00.026 { 00:13:00.026 "dma_device_id": "system", 00:13:00.026 "dma_device_type": 1 00:13:00.026 }, 00:13:00.026 { 
00:13:00.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.026 "dma_device_type": 2 00:13:00.026 } 00:13:00.026 ], 00:13:00.026 "driver_specific": {} 00:13:00.026 } 00:13:00.026 ] 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.026 23:34:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:00.284 23:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.284 "name": "Existed_Raid", 00:13:00.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.284 "strip_size_kb": 0, 00:13:00.284 "state": "configuring", 00:13:00.284 "raid_level": "raid1", 00:13:00.284 "superblock": false, 00:13:00.284 "num_base_bdevs": 3, 00:13:00.284 "num_base_bdevs_discovered": 2, 00:13:00.284 "num_base_bdevs_operational": 3, 00:13:00.284 "base_bdevs_list": [ 00:13:00.284 { 00:13:00.284 "name": "BaseBdev1", 00:13:00.284 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:13:00.284 "is_configured": true, 00:13:00.284 "data_offset": 0, 00:13:00.284 "data_size": 65536 00:13:00.284 }, 00:13:00.284 { 00:13:00.284 "name": "BaseBdev2", 00:13:00.284 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:00.284 "is_configured": true, 00:13:00.284 "data_offset": 0, 00:13:00.284 "data_size": 65536 00:13:00.284 }, 00:13:00.284 { 00:13:00.284 "name": "BaseBdev3", 00:13:00.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.284 "is_configured": false, 00:13:00.284 "data_offset": 0, 00:13:00.284 "data_size": 0 00:13:00.284 } 00:13:00.284 ] 00:13:00.284 }' 00:13:00.284 23:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.284 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.541 23:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:00.799 [2024-07-24 23:34:45.677303] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:00.800 [2024-07-24 23:34:45.677334] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27782a0 00:13:00.800 [2024-07-24 23:34:45.677339] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:13:00.800 [2024-07-24 23:34:45.677475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2778970 00:13:00.800 [2024-07-24 23:34:45.677568] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27782a0 00:13:00.800 [2024-07-24 23:34:45.677573] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27782a0 00:13:00.800 [2024-07-24 23:34:45.677711] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.800 BaseBdev3 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:00.800 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.058 23:34:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:01.058 [ 00:13:01.058 { 00:13:01.058 "name": "BaseBdev3", 00:13:01.058 "aliases": [ 00:13:01.058 "e8ce9908-3b0c-4fcf-a181-5057591660e4" 00:13:01.058 ], 00:13:01.058 "product_name": "Malloc disk", 00:13:01.058 "block_size": 512, 00:13:01.058 "num_blocks": 65536, 00:13:01.058 "uuid": "e8ce9908-3b0c-4fcf-a181-5057591660e4", 00:13:01.058 "assigned_rate_limits": { 
00:13:01.058 "rw_ios_per_sec": 0, 00:13:01.058 "rw_mbytes_per_sec": 0, 00:13:01.058 "r_mbytes_per_sec": 0, 00:13:01.058 "w_mbytes_per_sec": 0 00:13:01.058 }, 00:13:01.058 "claimed": true, 00:13:01.058 "claim_type": "exclusive_write", 00:13:01.058 "zoned": false, 00:13:01.058 "supported_io_types": { 00:13:01.058 "read": true, 00:13:01.058 "write": true, 00:13:01.058 "unmap": true, 00:13:01.058 "flush": true, 00:13:01.058 "reset": true, 00:13:01.058 "nvme_admin": false, 00:13:01.058 "nvme_io": false, 00:13:01.058 "nvme_io_md": false, 00:13:01.058 "write_zeroes": true, 00:13:01.058 "zcopy": true, 00:13:01.058 "get_zone_info": false, 00:13:01.058 "zone_management": false, 00:13:01.058 "zone_append": false, 00:13:01.058 "compare": false, 00:13:01.058 "compare_and_write": false, 00:13:01.058 "abort": true, 00:13:01.058 "seek_hole": false, 00:13:01.058 "seek_data": false, 00:13:01.058 "copy": true, 00:13:01.058 "nvme_iov_md": false 00:13:01.058 }, 00:13:01.058 "memory_domains": [ 00:13:01.058 { 00:13:01.058 "dma_device_id": "system", 00:13:01.058 "dma_device_type": 1 00:13:01.058 }, 00:13:01.058 { 00:13:01.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.058 "dma_device_type": 2 00:13:01.058 } 00:13:01.058 ], 00:13:01.058 "driver_specific": {} 00:13:01.058 } 00:13:01.058 ] 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.058 
23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.058 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.316 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.316 "name": "Existed_Raid", 00:13:01.316 "uuid": "1856c38e-ed54-418c-9e44-33312fa2153d", 00:13:01.316 "strip_size_kb": 0, 00:13:01.316 "state": "online", 00:13:01.316 "raid_level": "raid1", 00:13:01.316 "superblock": false, 00:13:01.316 "num_base_bdevs": 3, 00:13:01.316 "num_base_bdevs_discovered": 3, 00:13:01.316 "num_base_bdevs_operational": 3, 00:13:01.316 "base_bdevs_list": [ 00:13:01.316 { 00:13:01.316 "name": "BaseBdev1", 00:13:01.316 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:13:01.316 "is_configured": true, 00:13:01.316 "data_offset": 0, 00:13:01.316 "data_size": 65536 00:13:01.316 }, 00:13:01.316 { 00:13:01.316 "name": "BaseBdev2", 00:13:01.316 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:01.317 "is_configured": true, 00:13:01.317 "data_offset": 0, 
00:13:01.317 "data_size": 65536 00:13:01.317 }, 00:13:01.317 { 00:13:01.317 "name": "BaseBdev3", 00:13:01.317 "uuid": "e8ce9908-3b0c-4fcf-a181-5057591660e4", 00:13:01.317 "is_configured": true, 00:13:01.317 "data_offset": 0, 00:13:01.317 "data_size": 65536 00:13:01.317 } 00:13:01.317 ] 00:13:01.317 }' 00:13:01.317 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.317 23:34:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:01.895 [2024-07-24 23:34:46.832463] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:01.895 "name": "Existed_Raid", 00:13:01.895 "aliases": [ 00:13:01.895 "1856c38e-ed54-418c-9e44-33312fa2153d" 00:13:01.895 ], 00:13:01.895 "product_name": "Raid Volume", 00:13:01.895 "block_size": 512, 00:13:01.895 "num_blocks": 65536, 00:13:01.895 "uuid": 
"1856c38e-ed54-418c-9e44-33312fa2153d", 00:13:01.895 "assigned_rate_limits": { 00:13:01.895 "rw_ios_per_sec": 0, 00:13:01.895 "rw_mbytes_per_sec": 0, 00:13:01.895 "r_mbytes_per_sec": 0, 00:13:01.895 "w_mbytes_per_sec": 0 00:13:01.895 }, 00:13:01.895 "claimed": false, 00:13:01.895 "zoned": false, 00:13:01.895 "supported_io_types": { 00:13:01.895 "read": true, 00:13:01.895 "write": true, 00:13:01.895 "unmap": false, 00:13:01.895 "flush": false, 00:13:01.895 "reset": true, 00:13:01.895 "nvme_admin": false, 00:13:01.895 "nvme_io": false, 00:13:01.895 "nvme_io_md": false, 00:13:01.895 "write_zeroes": true, 00:13:01.895 "zcopy": false, 00:13:01.895 "get_zone_info": false, 00:13:01.895 "zone_management": false, 00:13:01.895 "zone_append": false, 00:13:01.895 "compare": false, 00:13:01.895 "compare_and_write": false, 00:13:01.895 "abort": false, 00:13:01.895 "seek_hole": false, 00:13:01.895 "seek_data": false, 00:13:01.895 "copy": false, 00:13:01.895 "nvme_iov_md": false 00:13:01.895 }, 00:13:01.895 "memory_domains": [ 00:13:01.895 { 00:13:01.895 "dma_device_id": "system", 00:13:01.895 "dma_device_type": 1 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.895 "dma_device_type": 2 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "dma_device_id": "system", 00:13:01.895 "dma_device_type": 1 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.895 "dma_device_type": 2 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "dma_device_id": "system", 00:13:01.895 "dma_device_type": 1 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.895 "dma_device_type": 2 00:13:01.895 } 00:13:01.895 ], 00:13:01.895 "driver_specific": { 00:13:01.895 "raid": { 00:13:01.895 "uuid": "1856c38e-ed54-418c-9e44-33312fa2153d", 00:13:01.895 "strip_size_kb": 0, 00:13:01.895 "state": "online", 00:13:01.895 "raid_level": "raid1", 00:13:01.895 "superblock": false, 00:13:01.895 
"num_base_bdevs": 3, 00:13:01.895 "num_base_bdevs_discovered": 3, 00:13:01.895 "num_base_bdevs_operational": 3, 00:13:01.895 "base_bdevs_list": [ 00:13:01.895 { 00:13:01.895 "name": "BaseBdev1", 00:13:01.895 "uuid": "b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:13:01.895 "is_configured": true, 00:13:01.895 "data_offset": 0, 00:13:01.895 "data_size": 65536 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "name": "BaseBdev2", 00:13:01.895 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:01.895 "is_configured": true, 00:13:01.895 "data_offset": 0, 00:13:01.895 "data_size": 65536 00:13:01.895 }, 00:13:01.895 { 00:13:01.895 "name": "BaseBdev3", 00:13:01.895 "uuid": "e8ce9908-3b0c-4fcf-a181-5057591660e4", 00:13:01.895 "is_configured": true, 00:13:01.895 "data_offset": 0, 00:13:01.895 "data_size": 65536 00:13:01.895 } 00:13:01.895 ] 00:13:01.895 } 00:13:01.895 } 00:13:01.895 }' 00:13:01.895 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:02.157 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:02.157 BaseBdev2 00:13:02.157 BaseBdev3' 00:13:02.157 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.157 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:02.157 23:34:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.157 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.157 "name": "BaseBdev1", 00:13:02.157 "aliases": [ 00:13:02.158 "b84a15a9-a21e-474c-a74d-ab84b6a6258d" 00:13:02.158 ], 00:13:02.158 "product_name": "Malloc disk", 00:13:02.158 "block_size": 512, 00:13:02.158 "num_blocks": 65536, 00:13:02.158 "uuid": 
"b84a15a9-a21e-474c-a74d-ab84b6a6258d", 00:13:02.158 "assigned_rate_limits": { 00:13:02.158 "rw_ios_per_sec": 0, 00:13:02.158 "rw_mbytes_per_sec": 0, 00:13:02.158 "r_mbytes_per_sec": 0, 00:13:02.158 "w_mbytes_per_sec": 0 00:13:02.158 }, 00:13:02.158 "claimed": true, 00:13:02.158 "claim_type": "exclusive_write", 00:13:02.158 "zoned": false, 00:13:02.158 "supported_io_types": { 00:13:02.158 "read": true, 00:13:02.158 "write": true, 00:13:02.158 "unmap": true, 00:13:02.158 "flush": true, 00:13:02.158 "reset": true, 00:13:02.158 "nvme_admin": false, 00:13:02.158 "nvme_io": false, 00:13:02.158 "nvme_io_md": false, 00:13:02.158 "write_zeroes": true, 00:13:02.158 "zcopy": true, 00:13:02.158 "get_zone_info": false, 00:13:02.158 "zone_management": false, 00:13:02.158 "zone_append": false, 00:13:02.158 "compare": false, 00:13:02.158 "compare_and_write": false, 00:13:02.158 "abort": true, 00:13:02.158 "seek_hole": false, 00:13:02.158 "seek_data": false, 00:13:02.158 "copy": true, 00:13:02.158 "nvme_iov_md": false 00:13:02.158 }, 00:13:02.158 "memory_domains": [ 00:13:02.158 { 00:13:02.158 "dma_device_id": "system", 00:13:02.158 "dma_device_type": 1 00:13:02.158 }, 00:13:02.158 { 00:13:02.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.158 "dma_device_type": 2 00:13:02.158 } 00:13:02.158 ], 00:13:02.158 "driver_specific": {} 00:13:02.158 }' 00:13:02.158 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.158 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.158 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:02.158 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:02.415 23:34:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.415 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:02.672 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.672 "name": "BaseBdev2", 00:13:02.672 "aliases": [ 00:13:02.672 "58df51e4-fcd7-4aa5-9046-4f789385e9d4" 00:13:02.672 ], 00:13:02.672 "product_name": "Malloc disk", 00:13:02.672 "block_size": 512, 00:13:02.672 "num_blocks": 65536, 00:13:02.672 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:02.672 "assigned_rate_limits": { 00:13:02.672 "rw_ios_per_sec": 0, 00:13:02.672 "rw_mbytes_per_sec": 0, 00:13:02.672 "r_mbytes_per_sec": 0, 00:13:02.672 "w_mbytes_per_sec": 0 00:13:02.672 }, 00:13:02.672 "claimed": true, 00:13:02.672 "claim_type": "exclusive_write", 00:13:02.672 "zoned": false, 00:13:02.672 "supported_io_types": { 00:13:02.672 "read": true, 00:13:02.672 "write": true, 00:13:02.672 "unmap": true, 00:13:02.672 "flush": true, 00:13:02.672 "reset": true, 00:13:02.672 "nvme_admin": false, 00:13:02.672 "nvme_io": false, 00:13:02.672 "nvme_io_md": false, 
00:13:02.672 "write_zeroes": true, 00:13:02.672 "zcopy": true, 00:13:02.673 "get_zone_info": false, 00:13:02.673 "zone_management": false, 00:13:02.673 "zone_append": false, 00:13:02.673 "compare": false, 00:13:02.673 "compare_and_write": false, 00:13:02.673 "abort": true, 00:13:02.673 "seek_hole": false, 00:13:02.673 "seek_data": false, 00:13:02.673 "copy": true, 00:13:02.673 "nvme_iov_md": false 00:13:02.673 }, 00:13:02.673 "memory_domains": [ 00:13:02.673 { 00:13:02.673 "dma_device_id": "system", 00:13:02.673 "dma_device_type": 1 00:13:02.673 }, 00:13:02.673 { 00:13:02.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.673 "dma_device_type": 2 00:13:02.673 } 00:13:02.673 ], 00:13:02.673 "driver_specific": {} 00:13:02.673 }' 00:13:02.673 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.673 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.673 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:02.673 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.673 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:02.931 23:34:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.931 23:34:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:03.188 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:03.188 "name": "BaseBdev3", 00:13:03.188 "aliases": [ 00:13:03.188 "e8ce9908-3b0c-4fcf-a181-5057591660e4" 00:13:03.188 ], 00:13:03.188 "product_name": "Malloc disk", 00:13:03.188 "block_size": 512, 00:13:03.188 "num_blocks": 65536, 00:13:03.188 "uuid": "e8ce9908-3b0c-4fcf-a181-5057591660e4", 00:13:03.188 "assigned_rate_limits": { 00:13:03.188 "rw_ios_per_sec": 0, 00:13:03.188 "rw_mbytes_per_sec": 0, 00:13:03.188 "r_mbytes_per_sec": 0, 00:13:03.188 "w_mbytes_per_sec": 0 00:13:03.188 }, 00:13:03.188 "claimed": true, 00:13:03.188 "claim_type": "exclusive_write", 00:13:03.188 "zoned": false, 00:13:03.188 "supported_io_types": { 00:13:03.188 "read": true, 00:13:03.188 "write": true, 00:13:03.188 "unmap": true, 00:13:03.188 "flush": true, 00:13:03.188 "reset": true, 00:13:03.188 "nvme_admin": false, 00:13:03.188 "nvme_io": false, 00:13:03.188 "nvme_io_md": false, 00:13:03.188 "write_zeroes": true, 00:13:03.188 "zcopy": true, 00:13:03.188 "get_zone_info": false, 00:13:03.188 "zone_management": false, 00:13:03.188 "zone_append": false, 00:13:03.188 "compare": false, 00:13:03.189 "compare_and_write": false, 00:13:03.189 "abort": true, 00:13:03.189 "seek_hole": false, 00:13:03.189 "seek_data": false, 00:13:03.189 "copy": true, 00:13:03.189 "nvme_iov_md": false 00:13:03.189 }, 00:13:03.189 "memory_domains": [ 00:13:03.189 { 00:13:03.189 "dma_device_id": "system", 00:13:03.189 "dma_device_type": 1 00:13:03.189 }, 00:13:03.189 { 00:13:03.189 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:03.189 "dma_device_type": 2 00:13:03.189 } 00:13:03.189 ], 00:13:03.189 "driver_specific": {} 00:13:03.189 }' 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:03.189 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.446 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:03.704 [2024-07-24 23:34:48.476701] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.704 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.705 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.705 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.705 "name": "Existed_Raid", 00:13:03.705 "uuid": "1856c38e-ed54-418c-9e44-33312fa2153d", 00:13:03.705 "strip_size_kb": 0, 00:13:03.705 "state": "online", 00:13:03.705 "raid_level": "raid1", 
00:13:03.705 "superblock": false, 00:13:03.705 "num_base_bdevs": 3, 00:13:03.705 "num_base_bdevs_discovered": 2, 00:13:03.705 "num_base_bdevs_operational": 2, 00:13:03.705 "base_bdevs_list": [ 00:13:03.705 { 00:13:03.705 "name": null, 00:13:03.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.705 "is_configured": false, 00:13:03.705 "data_offset": 0, 00:13:03.705 "data_size": 65536 00:13:03.705 }, 00:13:03.705 { 00:13:03.705 "name": "BaseBdev2", 00:13:03.705 "uuid": "58df51e4-fcd7-4aa5-9046-4f789385e9d4", 00:13:03.705 "is_configured": true, 00:13:03.705 "data_offset": 0, 00:13:03.705 "data_size": 65536 00:13:03.705 }, 00:13:03.705 { 00:13:03.705 "name": "BaseBdev3", 00:13:03.705 "uuid": "e8ce9908-3b0c-4fcf-a181-5057591660e4", 00:13:03.705 "is_configured": true, 00:13:03.705 "data_offset": 0, 00:13:03.705 "data_size": 65536 00:13:03.705 } 00:13:03.705 ] 00:13:03.705 }' 00:13:03.705 23:34:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.705 23:34:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.271 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:04.271 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:04.271 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.271 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:04.529 [2024-07-24 23:34:49.492176] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.529 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:04.787 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:04.788 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:04.788 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:05.046 [2024-07-24 23:34:49.838633] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:05.046 [2024-07-24 23:34:49.838691] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.046 [2024-07-24 23:34:49.848686] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:05.046 [2024-07-24 23:34:49.848712] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:05.046 [2024-07-24 23:34:49.848717] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27782a0 name Existed_Raid, state offline 00:13:05.046 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:05.046 23:34:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:05.046 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.046 23:34:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:05.046 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:05.304 BaseBdev2 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:05.304 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:05.562 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:05.562 [ 00:13:05.562 { 00:13:05.562 "name": "BaseBdev2", 00:13:05.562 "aliases": [ 00:13:05.562 "b591dcad-b336-4322-9d02-8ebfeaa557ae" 00:13:05.562 ], 00:13:05.562 "product_name": "Malloc disk", 00:13:05.562 "block_size": 512, 00:13:05.562 "num_blocks": 65536, 00:13:05.562 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:05.563 "assigned_rate_limits": { 00:13:05.563 "rw_ios_per_sec": 0, 00:13:05.563 "rw_mbytes_per_sec": 0, 00:13:05.563 "r_mbytes_per_sec": 0, 00:13:05.563 "w_mbytes_per_sec": 0 00:13:05.563 }, 00:13:05.563 "claimed": false, 00:13:05.563 "zoned": false, 00:13:05.563 "supported_io_types": { 00:13:05.563 "read": true, 00:13:05.563 "write": true, 00:13:05.563 "unmap": true, 00:13:05.563 "flush": true, 00:13:05.563 "reset": true, 00:13:05.563 "nvme_admin": false, 00:13:05.563 "nvme_io": false, 00:13:05.563 "nvme_io_md": false, 00:13:05.563 "write_zeroes": true, 00:13:05.563 "zcopy": true, 00:13:05.563 "get_zone_info": false, 00:13:05.563 "zone_management": false, 00:13:05.563 "zone_append": false, 00:13:05.563 "compare": false, 00:13:05.563 "compare_and_write": false, 00:13:05.563 "abort": true, 00:13:05.563 "seek_hole": false, 00:13:05.563 "seek_data": false, 00:13:05.563 "copy": true, 00:13:05.563 "nvme_iov_md": false 00:13:05.563 }, 00:13:05.563 "memory_domains": [ 00:13:05.563 { 00:13:05.563 "dma_device_id": "system", 00:13:05.563 "dma_device_type": 1 00:13:05.563 }, 00:13:05.563 { 00:13:05.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.563 "dma_device_type": 2 00:13:05.563 } 00:13:05.563 ], 00:13:05.563 "driver_specific": {} 00:13:05.563 } 00:13:05.563 ] 00:13:05.563 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:05.563 
23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:05.563 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:05.563 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:05.821 BaseBdev3 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:05.821 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:06.079 23:34:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:06.079 [ 00:13:06.079 { 00:13:06.079 "name": "BaseBdev3", 00:13:06.079 "aliases": [ 00:13:06.079 "4d48420f-cd97-477d-896c-89bb0c0335cb" 00:13:06.079 ], 00:13:06.079 "product_name": "Malloc disk", 00:13:06.079 "block_size": 512, 00:13:06.079 "num_blocks": 65536, 00:13:06.079 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:06.079 "assigned_rate_limits": { 00:13:06.079 "rw_ios_per_sec": 0, 00:13:06.079 "rw_mbytes_per_sec": 0, 00:13:06.079 
"r_mbytes_per_sec": 0, 00:13:06.079 "w_mbytes_per_sec": 0 00:13:06.079 }, 00:13:06.079 "claimed": false, 00:13:06.079 "zoned": false, 00:13:06.079 "supported_io_types": { 00:13:06.079 "read": true, 00:13:06.079 "write": true, 00:13:06.079 "unmap": true, 00:13:06.079 "flush": true, 00:13:06.079 "reset": true, 00:13:06.079 "nvme_admin": false, 00:13:06.079 "nvme_io": false, 00:13:06.079 "nvme_io_md": false, 00:13:06.079 "write_zeroes": true, 00:13:06.079 "zcopy": true, 00:13:06.079 "get_zone_info": false, 00:13:06.079 "zone_management": false, 00:13:06.079 "zone_append": false, 00:13:06.079 "compare": false, 00:13:06.079 "compare_and_write": false, 00:13:06.079 "abort": true, 00:13:06.079 "seek_hole": false, 00:13:06.079 "seek_data": false, 00:13:06.079 "copy": true, 00:13:06.079 "nvme_iov_md": false 00:13:06.079 }, 00:13:06.079 "memory_domains": [ 00:13:06.079 { 00:13:06.079 "dma_device_id": "system", 00:13:06.079 "dma_device_type": 1 00:13:06.079 }, 00:13:06.079 { 00:13:06.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.079 "dma_device_type": 2 00:13:06.079 } 00:13:06.079 ], 00:13:06.079 "driver_specific": {} 00:13:06.079 } 00:13:06.079 ] 00:13:06.079 23:34:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:06.079 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:06.079 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:06.079 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:06.337 [2024-07-24 23:34:51.195575] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:06.337 [2024-07-24 23:34:51.195608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:13:06.337 [2024-07-24 23:34:51.195621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:06.337 [2024-07-24 23:34:51.196595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.337 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.595 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.595 "name": "Existed_Raid", 00:13:06.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.595 "strip_size_kb": 0, 00:13:06.595 "state": 
"configuring", 00:13:06.595 "raid_level": "raid1", 00:13:06.595 "superblock": false, 00:13:06.595 "num_base_bdevs": 3, 00:13:06.595 "num_base_bdevs_discovered": 2, 00:13:06.595 "num_base_bdevs_operational": 3, 00:13:06.595 "base_bdevs_list": [ 00:13:06.595 { 00:13:06.595 "name": "BaseBdev1", 00:13:06.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.595 "is_configured": false, 00:13:06.595 "data_offset": 0, 00:13:06.595 "data_size": 0 00:13:06.595 }, 00:13:06.595 { 00:13:06.595 "name": "BaseBdev2", 00:13:06.595 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:06.595 "is_configured": true, 00:13:06.595 "data_offset": 0, 00:13:06.595 "data_size": 65536 00:13:06.595 }, 00:13:06.595 { 00:13:06.595 "name": "BaseBdev3", 00:13:06.595 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:06.595 "is_configured": true, 00:13:06.595 "data_offset": 0, 00:13:06.595 "data_size": 65536 00:13:06.595 } 00:13:06.595 ] 00:13:06.595 }' 00:13:06.595 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.595 23:34:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.161 23:34:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:07.161 [2024-07-24 23:34:52.037738] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:07.161 23:34:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.161 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.419 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.419 "name": "Existed_Raid", 00:13:07.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.419 "strip_size_kb": 0, 00:13:07.419 "state": "configuring", 00:13:07.419 "raid_level": "raid1", 00:13:07.419 "superblock": false, 00:13:07.419 "num_base_bdevs": 3, 00:13:07.419 "num_base_bdevs_discovered": 1, 00:13:07.419 "num_base_bdevs_operational": 3, 00:13:07.419 "base_bdevs_list": [ 00:13:07.419 { 00:13:07.419 "name": "BaseBdev1", 00:13:07.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.419 "is_configured": false, 00:13:07.419 "data_offset": 0, 00:13:07.419 "data_size": 0 00:13:07.419 }, 00:13:07.419 { 00:13:07.419 "name": null, 00:13:07.419 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:07.419 "is_configured": false, 00:13:07.419 "data_offset": 0, 00:13:07.419 "data_size": 65536 00:13:07.419 }, 00:13:07.419 { 00:13:07.419 "name": "BaseBdev3", 00:13:07.419 "uuid": 
"4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:07.419 "is_configured": true, 00:13:07.419 "data_offset": 0, 00:13:07.419 "data_size": 65536 00:13:07.419 } 00:13:07.419 ] 00:13:07.419 }' 00:13:07.419 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.419 23:34:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.010 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.010 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:08.010 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:08.010 23:34:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:08.268 [2024-07-24 23:34:53.047041] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:08.268 BaseBdev1 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.268 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:08.526 [ 00:13:08.526 { 00:13:08.526 "name": "BaseBdev1", 00:13:08.526 "aliases": [ 00:13:08.526 "e4b55382-4843-4907-904c-4cb0a94d71ef" 00:13:08.526 ], 00:13:08.526 "product_name": "Malloc disk", 00:13:08.526 "block_size": 512, 00:13:08.526 "num_blocks": 65536, 00:13:08.526 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:08.526 "assigned_rate_limits": { 00:13:08.526 "rw_ios_per_sec": 0, 00:13:08.526 "rw_mbytes_per_sec": 0, 00:13:08.526 "r_mbytes_per_sec": 0, 00:13:08.526 "w_mbytes_per_sec": 0 00:13:08.526 }, 00:13:08.526 "claimed": true, 00:13:08.526 "claim_type": "exclusive_write", 00:13:08.526 "zoned": false, 00:13:08.526 "supported_io_types": { 00:13:08.526 "read": true, 00:13:08.526 "write": true, 00:13:08.526 "unmap": true, 00:13:08.526 "flush": true, 00:13:08.526 "reset": true, 00:13:08.526 "nvme_admin": false, 00:13:08.526 "nvme_io": false, 00:13:08.526 "nvme_io_md": false, 00:13:08.526 "write_zeroes": true, 00:13:08.526 "zcopy": true, 00:13:08.526 "get_zone_info": false, 00:13:08.526 "zone_management": false, 00:13:08.526 "zone_append": false, 00:13:08.526 "compare": false, 00:13:08.526 "compare_and_write": false, 00:13:08.526 "abort": true, 00:13:08.526 "seek_hole": false, 00:13:08.526 "seek_data": false, 00:13:08.526 "copy": true, 00:13:08.526 "nvme_iov_md": false 00:13:08.526 }, 00:13:08.526 "memory_domains": [ 00:13:08.526 { 00:13:08.526 "dma_device_id": "system", 00:13:08.526 "dma_device_type": 1 00:13:08.526 }, 00:13:08.526 { 00:13:08.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.526 "dma_device_type": 2 00:13:08.526 } 00:13:08.526 ], 00:13:08.526 "driver_specific": {} 00:13:08.526 } 00:13:08.526 ] 
00:13:08.526 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:08.526 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:08.526 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.526 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.526 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.527 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.784 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.784 "name": "Existed_Raid", 00:13:08.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.784 "strip_size_kb": 0, 00:13:08.784 "state": "configuring", 00:13:08.784 "raid_level": "raid1", 00:13:08.784 "superblock": false, 00:13:08.784 "num_base_bdevs": 3, 00:13:08.784 
"num_base_bdevs_discovered": 2, 00:13:08.784 "num_base_bdevs_operational": 3, 00:13:08.784 "base_bdevs_list": [ 00:13:08.784 { 00:13:08.784 "name": "BaseBdev1", 00:13:08.784 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:08.784 "is_configured": true, 00:13:08.784 "data_offset": 0, 00:13:08.784 "data_size": 65536 00:13:08.784 }, 00:13:08.784 { 00:13:08.784 "name": null, 00:13:08.784 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:08.784 "is_configured": false, 00:13:08.784 "data_offset": 0, 00:13:08.784 "data_size": 65536 00:13:08.784 }, 00:13:08.784 { 00:13:08.784 "name": "BaseBdev3", 00:13:08.784 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:08.784 "is_configured": true, 00:13:08.784 "data_offset": 0, 00:13:08.784 "data_size": 65536 00:13:08.784 } 00:13:08.784 ] 00:13:08.784 }' 00:13:08.784 23:34:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.784 23:34:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.042 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:09.042 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.300 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:09.300 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:09.559 [2024-07-24 23:34:54.334394] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.559 "name": "Existed_Raid", 00:13:09.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.559 "strip_size_kb": 0, 00:13:09.559 "state": "configuring", 00:13:09.559 "raid_level": "raid1", 00:13:09.559 "superblock": false, 00:13:09.559 "num_base_bdevs": 3, 00:13:09.559 "num_base_bdevs_discovered": 1, 00:13:09.559 "num_base_bdevs_operational": 3, 00:13:09.559 "base_bdevs_list": [ 00:13:09.559 { 00:13:09.559 "name": "BaseBdev1", 00:13:09.559 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:09.559 "is_configured": true, 00:13:09.559 "data_offset": 0, 00:13:09.559 "data_size": 65536 
00:13:09.559 }, 00:13:09.559 { 00:13:09.559 "name": null, 00:13:09.559 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:09.559 "is_configured": false, 00:13:09.559 "data_offset": 0, 00:13:09.559 "data_size": 65536 00:13:09.559 }, 00:13:09.559 { 00:13:09.559 "name": null, 00:13:09.559 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:09.559 "is_configured": false, 00:13:09.559 "data_offset": 0, 00:13:09.559 "data_size": 65536 00:13:09.559 } 00:13:09.559 ] 00:13:09.559 }' 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.559 23:34:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.126 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.126 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:10.384 [2024-07-24 23:34:55.345021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:10.384 23:34:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.384 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.642 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.642 "name": "Existed_Raid", 00:13:10.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.642 "strip_size_kb": 0, 00:13:10.642 "state": "configuring", 00:13:10.642 "raid_level": "raid1", 00:13:10.642 "superblock": false, 00:13:10.642 "num_base_bdevs": 3, 00:13:10.642 "num_base_bdevs_discovered": 2, 00:13:10.642 "num_base_bdevs_operational": 3, 00:13:10.642 "base_bdevs_list": [ 00:13:10.642 { 00:13:10.642 "name": "BaseBdev1", 00:13:10.642 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:10.642 "is_configured": true, 00:13:10.642 "data_offset": 0, 00:13:10.642 "data_size": 65536 00:13:10.642 }, 00:13:10.642 { 00:13:10.642 "name": null, 00:13:10.642 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:10.642 "is_configured": false, 00:13:10.642 "data_offset": 0, 00:13:10.642 "data_size": 65536 00:13:10.642 }, 00:13:10.642 { 00:13:10.642 "name": "BaseBdev3", 00:13:10.642 "uuid": 
"4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:10.642 "is_configured": true, 00:13:10.642 "data_offset": 0, 00:13:10.642 "data_size": 65536 00:13:10.642 } 00:13:10.642 ] 00:13:10.642 }' 00:13:10.642 23:34:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.642 23:34:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.209 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:11.209 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.209 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:11.209 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:11.467 [2024-07-24 23:34:56.327568] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.467 23:34:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.467 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.725 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.725 "name": "Existed_Raid", 00:13:11.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.725 "strip_size_kb": 0, 00:13:11.725 "state": "configuring", 00:13:11.725 "raid_level": "raid1", 00:13:11.725 "superblock": false, 00:13:11.725 "num_base_bdevs": 3, 00:13:11.725 "num_base_bdevs_discovered": 1, 00:13:11.725 "num_base_bdevs_operational": 3, 00:13:11.725 "base_bdevs_list": [ 00:13:11.725 { 00:13:11.725 "name": null, 00:13:11.725 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:11.725 "is_configured": false, 00:13:11.725 "data_offset": 0, 00:13:11.725 "data_size": 65536 00:13:11.725 }, 00:13:11.725 { 00:13:11.725 "name": null, 00:13:11.725 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:11.725 "is_configured": false, 00:13:11.725 "data_offset": 0, 00:13:11.725 "data_size": 65536 00:13:11.725 }, 00:13:11.725 { 00:13:11.725 "name": "BaseBdev3", 00:13:11.725 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:11.725 "is_configured": true, 00:13:11.725 "data_offset": 0, 00:13:11.725 "data_size": 65536 00:13:11.725 } 00:13:11.725 ] 00:13:11.725 }' 00:13:11.725 23:34:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.725 23:34:56 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:12.291 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.291 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:12.291 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:12.291 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:12.550 [2024-07-24 23:34:57.327906] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.550 "name": "Existed_Raid", 00:13:12.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.550 "strip_size_kb": 0, 00:13:12.550 "state": "configuring", 00:13:12.550 "raid_level": "raid1", 00:13:12.550 "superblock": false, 00:13:12.550 "num_base_bdevs": 3, 00:13:12.550 "num_base_bdevs_discovered": 2, 00:13:12.550 "num_base_bdevs_operational": 3, 00:13:12.550 "base_bdevs_list": [ 00:13:12.550 { 00:13:12.550 "name": null, 00:13:12.550 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:12.550 "is_configured": false, 00:13:12.550 "data_offset": 0, 00:13:12.550 "data_size": 65536 00:13:12.550 }, 00:13:12.550 { 00:13:12.550 "name": "BaseBdev2", 00:13:12.550 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:12.550 "is_configured": true, 00:13:12.550 "data_offset": 0, 00:13:12.550 "data_size": 65536 00:13:12.550 }, 00:13:12.550 { 00:13:12.550 "name": "BaseBdev3", 00:13:12.550 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:12.550 "is_configured": true, 00:13:12.550 "data_offset": 0, 00:13:12.550 "data_size": 65536 00:13:12.550 } 00:13:12.550 ] 00:13:12.550 }' 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.550 23:34:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.117 23:34:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.117 23:34:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:13.376 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:13.376 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.376 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:13.376 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e4b55382-4843-4907-904c-4cb0a94d71ef 00:13:13.634 [2024-07-24 23:34:58.457541] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:13.634 [2024-07-24 23:34:58.457573] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27791e0 00:13:13.634 [2024-07-24 23:34:58.457576] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:13.634 [2024-07-24 23:34:58.457710] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2771730 00:13:13.634 [2024-07-24 23:34:58.457796] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27791e0 00:13:13.634 [2024-07-24 23:34:58.457801] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27791e0 00:13:13.634 [2024-07-24 23:34:58.457934] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:13.634 NewBaseBdev 00:13:13.634 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:13.634 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:13:13.635 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- 
# local bdev_timeout= 00:13:13.635 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:13.635 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:13.635 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:13.635 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:13.892 [ 00:13:13.892 { 00:13:13.892 "name": "NewBaseBdev", 00:13:13.892 "aliases": [ 00:13:13.892 "e4b55382-4843-4907-904c-4cb0a94d71ef" 00:13:13.892 ], 00:13:13.892 "product_name": "Malloc disk", 00:13:13.892 "block_size": 512, 00:13:13.892 "num_blocks": 65536, 00:13:13.892 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:13.892 "assigned_rate_limits": { 00:13:13.892 "rw_ios_per_sec": 0, 00:13:13.892 "rw_mbytes_per_sec": 0, 00:13:13.892 "r_mbytes_per_sec": 0, 00:13:13.892 "w_mbytes_per_sec": 0 00:13:13.892 }, 00:13:13.892 "claimed": true, 00:13:13.892 "claim_type": "exclusive_write", 00:13:13.892 "zoned": false, 00:13:13.892 "supported_io_types": { 00:13:13.892 "read": true, 00:13:13.892 "write": true, 00:13:13.892 "unmap": true, 00:13:13.892 "flush": true, 00:13:13.892 "reset": true, 00:13:13.892 "nvme_admin": false, 00:13:13.892 "nvme_io": false, 00:13:13.892 "nvme_io_md": false, 00:13:13.892 "write_zeroes": true, 00:13:13.892 "zcopy": true, 00:13:13.892 "get_zone_info": false, 00:13:13.892 "zone_management": false, 00:13:13.892 "zone_append": false, 00:13:13.892 "compare": false, 00:13:13.892 "compare_and_write": false, 00:13:13.892 "abort": true, 00:13:13.892 "seek_hole": false, 
00:13:13.892 "seek_data": false, 00:13:13.892 "copy": true, 00:13:13.892 "nvme_iov_md": false 00:13:13.892 }, 00:13:13.892 "memory_domains": [ 00:13:13.892 { 00:13:13.892 "dma_device_id": "system", 00:13:13.892 "dma_device_type": 1 00:13:13.892 }, 00:13:13.892 { 00:13:13.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.892 "dma_device_type": 2 00:13:13.892 } 00:13:13.892 ], 00:13:13.892 "driver_specific": {} 00:13:13.892 } 00:13:13.892 ] 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.892 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:13:14.149 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.149 "name": "Existed_Raid", 00:13:14.149 "uuid": "a3c69c76-64f4-4557-be1a-2943a518c86d", 00:13:14.149 "strip_size_kb": 0, 00:13:14.149 "state": "online", 00:13:14.149 "raid_level": "raid1", 00:13:14.149 "superblock": false, 00:13:14.149 "num_base_bdevs": 3, 00:13:14.149 "num_base_bdevs_discovered": 3, 00:13:14.149 "num_base_bdevs_operational": 3, 00:13:14.149 "base_bdevs_list": [ 00:13:14.149 { 00:13:14.149 "name": "NewBaseBdev", 00:13:14.149 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:14.149 "is_configured": true, 00:13:14.149 "data_offset": 0, 00:13:14.149 "data_size": 65536 00:13:14.149 }, 00:13:14.149 { 00:13:14.149 "name": "BaseBdev2", 00:13:14.149 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:14.149 "is_configured": true, 00:13:14.149 "data_offset": 0, 00:13:14.149 "data_size": 65536 00:13:14.149 }, 00:13:14.149 { 00:13:14.149 "name": "BaseBdev3", 00:13:14.149 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:14.149 "is_configured": true, 00:13:14.149 "data_offset": 0, 00:13:14.149 "data_size": 65536 00:13:14.149 } 00:13:14.149 ] 00:13:14.149 }' 00:13:14.149 23:34:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.149 23:34:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.729 [2024-07-24 23:34:59.572687] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.729 "name": "Existed_Raid", 00:13:14.729 "aliases": [ 00:13:14.729 "a3c69c76-64f4-4557-be1a-2943a518c86d" 00:13:14.729 ], 00:13:14.729 "product_name": "Raid Volume", 00:13:14.729 "block_size": 512, 00:13:14.729 "num_blocks": 65536, 00:13:14.729 "uuid": "a3c69c76-64f4-4557-be1a-2943a518c86d", 00:13:14.729 "assigned_rate_limits": { 00:13:14.729 "rw_ios_per_sec": 0, 00:13:14.729 "rw_mbytes_per_sec": 0, 00:13:14.729 "r_mbytes_per_sec": 0, 00:13:14.729 "w_mbytes_per_sec": 0 00:13:14.729 }, 00:13:14.729 "claimed": false, 00:13:14.729 "zoned": false, 00:13:14.729 "supported_io_types": { 00:13:14.729 "read": true, 00:13:14.729 "write": true, 00:13:14.729 "unmap": false, 00:13:14.729 "flush": false, 00:13:14.729 "reset": true, 00:13:14.729 "nvme_admin": false, 00:13:14.729 "nvme_io": false, 00:13:14.729 "nvme_io_md": false, 00:13:14.729 "write_zeroes": true, 00:13:14.729 "zcopy": false, 00:13:14.729 "get_zone_info": false, 00:13:14.729 "zone_management": false, 00:13:14.729 "zone_append": false, 00:13:14.729 "compare": false, 00:13:14.729 "compare_and_write": false, 00:13:14.729 "abort": false, 00:13:14.729 "seek_hole": false, 00:13:14.729 "seek_data": false, 00:13:14.729 "copy": false, 00:13:14.729 "nvme_iov_md": false 00:13:14.729 }, 00:13:14.729 "memory_domains": [ 00:13:14.729 { 00:13:14.729 "dma_device_id": "system", 
00:13:14.729 "dma_device_type": 1 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.729 "dma_device_type": 2 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "dma_device_id": "system", 00:13:14.729 "dma_device_type": 1 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.729 "dma_device_type": 2 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "dma_device_id": "system", 00:13:14.729 "dma_device_type": 1 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.729 "dma_device_type": 2 00:13:14.729 } 00:13:14.729 ], 00:13:14.729 "driver_specific": { 00:13:14.729 "raid": { 00:13:14.729 "uuid": "a3c69c76-64f4-4557-be1a-2943a518c86d", 00:13:14.729 "strip_size_kb": 0, 00:13:14.729 "state": "online", 00:13:14.729 "raid_level": "raid1", 00:13:14.729 "superblock": false, 00:13:14.729 "num_base_bdevs": 3, 00:13:14.729 "num_base_bdevs_discovered": 3, 00:13:14.729 "num_base_bdevs_operational": 3, 00:13:14.729 "base_bdevs_list": [ 00:13:14.729 { 00:13:14.729 "name": "NewBaseBdev", 00:13:14.729 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:14.729 "is_configured": true, 00:13:14.729 "data_offset": 0, 00:13:14.729 "data_size": 65536 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "name": "BaseBdev2", 00:13:14.729 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:14.729 "is_configured": true, 00:13:14.729 "data_offset": 0, 00:13:14.729 "data_size": 65536 00:13:14.729 }, 00:13:14.729 { 00:13:14.729 "name": "BaseBdev3", 00:13:14.729 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:14.729 "is_configured": true, 00:13:14.729 "data_offset": 0, 00:13:14.729 "data_size": 65536 00:13:14.729 } 00:13:14.729 ] 00:13:14.729 } 00:13:14.729 } 00:13:14.729 }' 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.729 23:34:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:14.729 BaseBdev2 00:13:14.729 BaseBdev3' 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:14.729 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.987 "name": "NewBaseBdev", 00:13:14.987 "aliases": [ 00:13:14.987 "e4b55382-4843-4907-904c-4cb0a94d71ef" 00:13:14.987 ], 00:13:14.987 "product_name": "Malloc disk", 00:13:14.987 "block_size": 512, 00:13:14.987 "num_blocks": 65536, 00:13:14.987 "uuid": "e4b55382-4843-4907-904c-4cb0a94d71ef", 00:13:14.987 "assigned_rate_limits": { 00:13:14.987 "rw_ios_per_sec": 0, 00:13:14.987 "rw_mbytes_per_sec": 0, 00:13:14.987 "r_mbytes_per_sec": 0, 00:13:14.987 "w_mbytes_per_sec": 0 00:13:14.987 }, 00:13:14.987 "claimed": true, 00:13:14.987 "claim_type": "exclusive_write", 00:13:14.987 "zoned": false, 00:13:14.987 "supported_io_types": { 00:13:14.987 "read": true, 00:13:14.987 "write": true, 00:13:14.987 "unmap": true, 00:13:14.987 "flush": true, 00:13:14.987 "reset": true, 00:13:14.987 "nvme_admin": false, 00:13:14.987 "nvme_io": false, 00:13:14.987 "nvme_io_md": false, 00:13:14.987 "write_zeroes": true, 00:13:14.987 "zcopy": true, 00:13:14.987 "get_zone_info": false, 00:13:14.987 "zone_management": false, 00:13:14.987 "zone_append": false, 00:13:14.987 "compare": false, 00:13:14.987 "compare_and_write": false, 00:13:14.987 "abort": true, 00:13:14.987 "seek_hole": false, 00:13:14.987 "seek_data": false, 00:13:14.987 "copy": true, 00:13:14.987 "nvme_iov_md": false 00:13:14.987 }, 00:13:14.987 "memory_domains": [ 
00:13:14.987 { 00:13:14.987 "dma_device_id": "system", 00:13:14.987 "dma_device_type": 1 00:13:14.987 }, 00:13:14.987 { 00:13:14.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.987 "dma_device_type": 2 00:13:14.987 } 00:13:14.987 ], 00:13:14.987 "driver_specific": {} 00:13:14.987 }' 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.987 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.988 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.988 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.988 23:34:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.246 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.504 "name": "BaseBdev2", 00:13:15.504 "aliases": [ 00:13:15.504 "b591dcad-b336-4322-9d02-8ebfeaa557ae" 00:13:15.504 ], 00:13:15.504 "product_name": "Malloc disk", 00:13:15.504 "block_size": 512, 00:13:15.504 "num_blocks": 65536, 00:13:15.504 "uuid": "b591dcad-b336-4322-9d02-8ebfeaa557ae", 00:13:15.504 "assigned_rate_limits": { 00:13:15.504 "rw_ios_per_sec": 0, 00:13:15.504 "rw_mbytes_per_sec": 0, 00:13:15.504 "r_mbytes_per_sec": 0, 00:13:15.504 "w_mbytes_per_sec": 0 00:13:15.504 }, 00:13:15.504 "claimed": true, 00:13:15.504 "claim_type": "exclusive_write", 00:13:15.504 "zoned": false, 00:13:15.504 "supported_io_types": { 00:13:15.504 "read": true, 00:13:15.504 "write": true, 00:13:15.504 "unmap": true, 00:13:15.504 "flush": true, 00:13:15.504 "reset": true, 00:13:15.504 "nvme_admin": false, 00:13:15.504 "nvme_io": false, 00:13:15.504 "nvme_io_md": false, 00:13:15.504 "write_zeroes": true, 00:13:15.504 "zcopy": true, 00:13:15.504 "get_zone_info": false, 00:13:15.504 "zone_management": false, 00:13:15.504 "zone_append": false, 00:13:15.504 "compare": false, 00:13:15.504 "compare_and_write": false, 00:13:15.504 "abort": true, 00:13:15.504 "seek_hole": false, 00:13:15.504 "seek_data": false, 00:13:15.504 "copy": true, 00:13:15.504 "nvme_iov_md": false 00:13:15.504 }, 00:13:15.504 "memory_domains": [ 00:13:15.504 { 00:13:15.504 "dma_device_id": "system", 00:13:15.504 "dma_device_type": 1 00:13:15.504 }, 00:13:15.504 { 00:13:15.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.504 "dma_device_type": 2 00:13:15.504 } 00:13:15.504 ], 00:13:15.504 "driver_specific": {} 00:13:15.504 }' 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.504 23:35:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.504 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.762 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.762 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.762 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.762 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.763 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:15.763 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.763 "name": "BaseBdev3", 00:13:15.763 "aliases": [ 00:13:15.763 "4d48420f-cd97-477d-896c-89bb0c0335cb" 00:13:15.763 ], 00:13:15.763 "product_name": "Malloc disk", 00:13:15.763 "block_size": 512, 00:13:15.763 "num_blocks": 65536, 00:13:15.763 "uuid": "4d48420f-cd97-477d-896c-89bb0c0335cb", 00:13:15.763 "assigned_rate_limits": { 00:13:15.763 "rw_ios_per_sec": 0, 00:13:15.763 "rw_mbytes_per_sec": 0, 00:13:15.763 "r_mbytes_per_sec": 0, 00:13:15.763 "w_mbytes_per_sec": 0 00:13:15.763 }, 00:13:15.763 "claimed": true, 00:13:15.763 "claim_type": "exclusive_write", 
00:13:15.763 "zoned": false, 00:13:15.763 "supported_io_types": { 00:13:15.763 "read": true, 00:13:15.763 "write": true, 00:13:15.763 "unmap": true, 00:13:15.763 "flush": true, 00:13:15.763 "reset": true, 00:13:15.763 "nvme_admin": false, 00:13:15.763 "nvme_io": false, 00:13:15.763 "nvme_io_md": false, 00:13:15.763 "write_zeroes": true, 00:13:15.763 "zcopy": true, 00:13:15.763 "get_zone_info": false, 00:13:15.763 "zone_management": false, 00:13:15.763 "zone_append": false, 00:13:15.763 "compare": false, 00:13:15.763 "compare_and_write": false, 00:13:15.763 "abort": true, 00:13:15.763 "seek_hole": false, 00:13:15.763 "seek_data": false, 00:13:15.763 "copy": true, 00:13:15.763 "nvme_iov_md": false 00:13:15.763 }, 00:13:15.763 "memory_domains": [ 00:13:15.763 { 00:13:15.763 "dma_device_id": "system", 00:13:15.763 "dma_device_type": 1 00:13:15.763 }, 00:13:15.763 { 00:13:15.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.763 "dma_device_type": 2 00:13:15.763 } 00:13:15.763 ], 00:13:15.763 "driver_specific": {} 00:13:15.763 }' 00:13:15.763 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.021 23:35:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.021 23:35:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.021 23:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:16.280 [2024-07-24 23:35:01.176668] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:16.280 [2024-07-24 23:35:01.176689] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:16.280 [2024-07-24 23:35:01.176728] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:16.280 [2024-07-24 23:35:01.176901] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:16.280 [2024-07-24 23:35:01.176908] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27791e0 name Existed_Raid, state offline 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 288592 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 288592 ']' 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 288592 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 288592 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:16.280 23:35:01 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 288592' 00:13:16.280 killing process with pid 288592 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 288592 00:13:16.280 [2024-07-24 23:35:01.234200] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:16.280 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 288592 00:13:16.280 [2024-07-24 23:35:01.256748] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:16.539 00:13:16.539 real 0m21.307s 00:13:16.539 user 0m39.626s 00:13:16.539 sys 0m3.300s 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.539 ************************************ 00:13:16.539 END TEST raid_state_function_test 00:13:16.539 ************************************ 00:13:16.539 23:35:01 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:13:16.539 23:35:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:16.539 23:35:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:16.539 23:35:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:16.539 ************************************ 00:13:16.539 START TEST raid_state_function_test_sb 00:13:16.539 ************************************ 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 true 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=292744 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 292744' 00:13:16.539 Process raid pid: 292744 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 292744 /var/tmp/spdk-raid.sock 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 292744 ']' 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:16.539 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:16.539 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:16.540 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:16.540 23:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.798 [2024-07-24 23:35:01.557618] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:13:16.798 [2024-07-24 23:35:01.557656] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:16.798 [2024-07-24 23:35:01.625769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:16.798 [2024-07-24 23:35:01.699684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:16.798 [2024-07-24 23:35:01.753598] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:16.798 [2024-07-24 23:35:01.753625] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:17.365 23:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:17.365 23:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:13:17.365 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:17.624 [2024-07-24 23:35:02.496801] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:17.624 [2024-07-24 23:35:02.496830] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:17.624 [2024-07-24 23:35:02.496835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:13:17.624 [2024-07-24 23:35:02.496841] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:17.624 [2024-07-24 23:35:02.496844] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:17.624 [2024-07-24 23:35:02.496849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.624 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.883 23:35:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.883 "name": "Existed_Raid", 00:13:17.883 "uuid": "600f70a9-c2c9-464d-84e2-5db4cab0a76a", 00:13:17.883 "strip_size_kb": 0, 00:13:17.883 "state": "configuring", 00:13:17.883 "raid_level": "raid1", 00:13:17.883 "superblock": true, 00:13:17.883 "num_base_bdevs": 3, 00:13:17.883 "num_base_bdevs_discovered": 0, 00:13:17.883 "num_base_bdevs_operational": 3, 00:13:17.883 "base_bdevs_list": [ 00:13:17.883 { 00:13:17.883 "name": "BaseBdev1", 00:13:17.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.883 "is_configured": false, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 0 00:13:17.883 }, 00:13:17.883 { 00:13:17.883 "name": "BaseBdev2", 00:13:17.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.883 "is_configured": false, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 0 00:13:17.883 }, 00:13:17.883 { 00:13:17.883 "name": "BaseBdev3", 00:13:17.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.883 "is_configured": false, 00:13:17.883 "data_offset": 0, 00:13:17.883 "data_size": 0 00:13:17.883 } 00:13:17.883 ] 00:13:17.883 }' 00:13:17.883 23:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.883 23:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:18.449 23:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:18.449 [2024-07-24 23:35:03.318841] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:18.449 [2024-07-24 23:35:03.318860] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93db30 name Existed_Raid, state configuring 00:13:18.449 23:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:18.707 [2024-07-24 23:35:03.491304] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:18.707 [2024-07-24 23:35:03.491320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:18.707 [2024-07-24 23:35:03.491324] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:18.707 [2024-07-24 23:35:03.491329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:18.707 [2024-07-24 23:35:03.491333] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:18.707 [2024-07-24 23:35:03.491338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:18.707 [2024-07-24 23:35:03.663668] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:18.707 BaseBdev1 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:13:18.707 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:18.965 23:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:19.223 [ 00:13:19.223 { 00:13:19.223 "name": "BaseBdev1", 00:13:19.223 "aliases": [ 00:13:19.223 "4eee9793-4c92-45d1-af1d-490fbcbc9508" 00:13:19.223 ], 00:13:19.223 "product_name": "Malloc disk", 00:13:19.223 "block_size": 512, 00:13:19.223 "num_blocks": 65536, 00:13:19.223 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:19.223 "assigned_rate_limits": { 00:13:19.223 "rw_ios_per_sec": 0, 00:13:19.223 "rw_mbytes_per_sec": 0, 00:13:19.223 "r_mbytes_per_sec": 0, 00:13:19.223 "w_mbytes_per_sec": 0 00:13:19.223 }, 00:13:19.223 "claimed": true, 00:13:19.223 "claim_type": "exclusive_write", 00:13:19.223 "zoned": false, 00:13:19.223 "supported_io_types": { 00:13:19.223 "read": true, 00:13:19.223 "write": true, 00:13:19.223 "unmap": true, 00:13:19.223 "flush": true, 00:13:19.223 "reset": true, 00:13:19.223 "nvme_admin": false, 00:13:19.223 "nvme_io": false, 00:13:19.223 "nvme_io_md": false, 00:13:19.223 "write_zeroes": true, 00:13:19.223 "zcopy": true, 00:13:19.223 "get_zone_info": false, 00:13:19.223 "zone_management": false, 00:13:19.223 "zone_append": false, 00:13:19.223 "compare": false, 00:13:19.223 "compare_and_write": false, 00:13:19.223 "abort": true, 00:13:19.223 "seek_hole": false, 00:13:19.223 "seek_data": false, 00:13:19.223 "copy": true, 00:13:19.223 "nvme_iov_md": false 00:13:19.223 }, 00:13:19.223 "memory_domains": [ 00:13:19.223 { 00:13:19.223 "dma_device_id": "system", 00:13:19.223 "dma_device_type": 1 00:13:19.223 }, 00:13:19.223 { 00:13:19.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.223 
"dma_device_type": 2 00:13:19.223 } 00:13:19.223 ], 00:13:19.223 "driver_specific": {} 00:13:19.223 } 00:13:19.223 ] 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.223 "name": "Existed_Raid", 00:13:19.223 "uuid": "efe07053-3921-4f6c-b75b-bcdb11a4b737", 00:13:19.223 "strip_size_kb": 0, 
00:13:19.223 "state": "configuring", 00:13:19.223 "raid_level": "raid1", 00:13:19.223 "superblock": true, 00:13:19.223 "num_base_bdevs": 3, 00:13:19.223 "num_base_bdevs_discovered": 1, 00:13:19.223 "num_base_bdevs_operational": 3, 00:13:19.223 "base_bdevs_list": [ 00:13:19.223 { 00:13:19.223 "name": "BaseBdev1", 00:13:19.223 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:19.223 "is_configured": true, 00:13:19.223 "data_offset": 2048, 00:13:19.223 "data_size": 63488 00:13:19.223 }, 00:13:19.223 { 00:13:19.223 "name": "BaseBdev2", 00:13:19.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.223 "is_configured": false, 00:13:19.223 "data_offset": 0, 00:13:19.223 "data_size": 0 00:13:19.223 }, 00:13:19.223 { 00:13:19.223 "name": "BaseBdev3", 00:13:19.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.223 "is_configured": false, 00:13:19.223 "data_offset": 0, 00:13:19.223 "data_size": 0 00:13:19.223 } 00:13:19.223 ] 00:13:19.223 }' 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.223 23:35:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.790 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:20.048 [2024-07-24 23:35:04.834678] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:20.048 [2024-07-24 23:35:04.834707] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93d3a0 name Existed_Raid, state configuring 00:13:20.048 23:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:20.048 [2024-07-24 23:35:05.007143] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:20.048 [2024-07-24 23:35:05.008204] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:20.048 [2024-07-24 23:35:05.008229] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:20.048 [2024-07-24 23:35:05.008234] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:20.048 [2024-07-24 23:35:05.008239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.048 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.306 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.306 "name": "Existed_Raid", 00:13:20.306 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:20.306 "strip_size_kb": 0, 00:13:20.306 "state": "configuring", 00:13:20.306 "raid_level": "raid1", 00:13:20.306 "superblock": true, 00:13:20.306 "num_base_bdevs": 3, 00:13:20.306 "num_base_bdevs_discovered": 1, 00:13:20.306 "num_base_bdevs_operational": 3, 00:13:20.306 "base_bdevs_list": [ 00:13:20.306 { 00:13:20.306 "name": "BaseBdev1", 00:13:20.306 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:20.306 "is_configured": true, 00:13:20.306 "data_offset": 2048, 00:13:20.306 "data_size": 63488 00:13:20.306 }, 00:13:20.306 { 00:13:20.306 "name": "BaseBdev2", 00:13:20.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.306 "is_configured": false, 00:13:20.306 "data_offset": 0, 00:13:20.306 "data_size": 0 00:13:20.306 }, 00:13:20.306 { 00:13:20.306 "name": "BaseBdev3", 00:13:20.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.306 "is_configured": false, 00:13:20.306 "data_offset": 0, 00:13:20.306 "data_size": 0 00:13:20.306 } 00:13:20.306 ] 00:13:20.306 }' 00:13:20.306 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.306 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:20.870 
[2024-07-24 23:35:05.819903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:20.870 BaseBdev2 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:20.870 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:21.127 23:35:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:21.384 [ 00:13:21.384 { 00:13:21.384 "name": "BaseBdev2", 00:13:21.384 "aliases": [ 00:13:21.384 "5f26dfcc-1e50-4091-8cee-99e69c508cc1" 00:13:21.384 ], 00:13:21.384 "product_name": "Malloc disk", 00:13:21.384 "block_size": 512, 00:13:21.384 "num_blocks": 65536, 00:13:21.384 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:21.384 "assigned_rate_limits": { 00:13:21.384 "rw_ios_per_sec": 0, 00:13:21.384 "rw_mbytes_per_sec": 0, 00:13:21.384 "r_mbytes_per_sec": 0, 00:13:21.384 "w_mbytes_per_sec": 0 00:13:21.384 }, 00:13:21.384 "claimed": true, 00:13:21.384 "claim_type": "exclusive_write", 00:13:21.384 "zoned": false, 00:13:21.384 "supported_io_types": { 00:13:21.384 "read": true, 00:13:21.384 "write": true, 00:13:21.384 "unmap": 
true, 00:13:21.384 "flush": true, 00:13:21.384 "reset": true, 00:13:21.384 "nvme_admin": false, 00:13:21.384 "nvme_io": false, 00:13:21.384 "nvme_io_md": false, 00:13:21.384 "write_zeroes": true, 00:13:21.384 "zcopy": true, 00:13:21.384 "get_zone_info": false, 00:13:21.384 "zone_management": false, 00:13:21.384 "zone_append": false, 00:13:21.384 "compare": false, 00:13:21.384 "compare_and_write": false, 00:13:21.384 "abort": true, 00:13:21.384 "seek_hole": false, 00:13:21.384 "seek_data": false, 00:13:21.384 "copy": true, 00:13:21.384 "nvme_iov_md": false 00:13:21.384 }, 00:13:21.384 "memory_domains": [ 00:13:21.384 { 00:13:21.384 "dma_device_id": "system", 00:13:21.384 "dma_device_type": 1 00:13:21.384 }, 00:13:21.384 { 00:13:21.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.384 "dma_device_type": 2 00:13:21.384 } 00:13:21.384 ], 00:13:21.384 "driver_specific": {} 00:13:21.384 } 00:13:21.384 ] 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.384 
23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.384 "name": "Existed_Raid", 00:13:21.384 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:21.384 "strip_size_kb": 0, 00:13:21.384 "state": "configuring", 00:13:21.384 "raid_level": "raid1", 00:13:21.384 "superblock": true, 00:13:21.384 "num_base_bdevs": 3, 00:13:21.384 "num_base_bdevs_discovered": 2, 00:13:21.384 "num_base_bdevs_operational": 3, 00:13:21.384 "base_bdevs_list": [ 00:13:21.384 { 00:13:21.384 "name": "BaseBdev1", 00:13:21.384 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:21.384 "is_configured": true, 00:13:21.384 "data_offset": 2048, 00:13:21.384 "data_size": 63488 00:13:21.384 }, 00:13:21.384 { 00:13:21.384 "name": "BaseBdev2", 00:13:21.384 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:21.384 "is_configured": true, 00:13:21.384 "data_offset": 2048, 00:13:21.384 "data_size": 63488 00:13:21.384 }, 00:13:21.384 { 00:13:21.384 "name": "BaseBdev3", 00:13:21.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.384 "is_configured": false, 00:13:21.384 "data_offset": 0, 00:13:21.384 "data_size": 0 00:13:21.384 } 00:13:21.384 ] 00:13:21.384 }' 00:13:21.384 
23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.384 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.951 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:22.209 [2024-07-24 23:35:06.969559] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:22.209 [2024-07-24 23:35:06.969703] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x93e2a0 00:13:22.209 [2024-07-24 23:35:06.969713] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:22.209 [2024-07-24 23:35:06.969839] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x93e970 00:13:22.209 [2024-07-24 23:35:06.969931] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x93e2a0 00:13:22.209 [2024-07-24 23:35:06.969940] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x93e2a0 00:13:22.209 [2024-07-24 23:35:06.970003] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.209 BaseBdev3 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:13:22.210 23:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:22.210 23:35:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:22.468 [ 00:13:22.468 { 00:13:22.468 "name": "BaseBdev3", 00:13:22.468 "aliases": [ 00:13:22.468 "3544d0f5-a656-4511-9ddd-fc96de9bb5c5" 00:13:22.468 ], 00:13:22.468 "product_name": "Malloc disk", 00:13:22.468 "block_size": 512, 00:13:22.468 "num_blocks": 65536, 00:13:22.468 "uuid": "3544d0f5-a656-4511-9ddd-fc96de9bb5c5", 00:13:22.468 "assigned_rate_limits": { 00:13:22.468 "rw_ios_per_sec": 0, 00:13:22.468 "rw_mbytes_per_sec": 0, 00:13:22.468 "r_mbytes_per_sec": 0, 00:13:22.468 "w_mbytes_per_sec": 0 00:13:22.468 }, 00:13:22.468 "claimed": true, 00:13:22.468 "claim_type": "exclusive_write", 00:13:22.468 "zoned": false, 00:13:22.468 "supported_io_types": { 00:13:22.468 "read": true, 00:13:22.468 "write": true, 00:13:22.468 "unmap": true, 00:13:22.468 "flush": true, 00:13:22.468 "reset": true, 00:13:22.468 "nvme_admin": false, 00:13:22.468 "nvme_io": false, 00:13:22.468 "nvme_io_md": false, 00:13:22.468 "write_zeroes": true, 00:13:22.468 "zcopy": true, 00:13:22.468 "get_zone_info": false, 00:13:22.468 "zone_management": false, 00:13:22.468 "zone_append": false, 00:13:22.468 "compare": false, 00:13:22.468 "compare_and_write": false, 00:13:22.468 "abort": true, 00:13:22.468 "seek_hole": false, 00:13:22.468 "seek_data": false, 00:13:22.468 "copy": true, 00:13:22.468 "nvme_iov_md": false 00:13:22.468 }, 00:13:22.468 "memory_domains": [ 00:13:22.468 { 00:13:22.468 "dma_device_id": "system", 00:13:22.468 "dma_device_type": 1 00:13:22.468 }, 00:13:22.468 { 00:13:22.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:22.468 
"dma_device_type": 2 00:13:22.468 } 00:13:22.468 ], 00:13:22.468 "driver_specific": {} 00:13:22.468 } 00:13:22.468 ] 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.468 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.468 23:35:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.468 "name": "Existed_Raid", 00:13:22.468 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:22.468 "strip_size_kb": 0, 00:13:22.469 "state": "online", 00:13:22.469 "raid_level": "raid1", 00:13:22.469 "superblock": true, 00:13:22.469 "num_base_bdevs": 3, 00:13:22.469 "num_base_bdevs_discovered": 3, 00:13:22.469 "num_base_bdevs_operational": 3, 00:13:22.469 "base_bdevs_list": [ 00:13:22.469 { 00:13:22.469 "name": "BaseBdev1", 00:13:22.469 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:22.469 "is_configured": true, 00:13:22.469 "data_offset": 2048, 00:13:22.469 "data_size": 63488 00:13:22.469 }, 00:13:22.469 { 00:13:22.469 "name": "BaseBdev2", 00:13:22.469 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:22.469 "is_configured": true, 00:13:22.469 "data_offset": 2048, 00:13:22.469 "data_size": 63488 00:13:22.469 }, 00:13:22.469 { 00:13:22.469 "name": "BaseBdev3", 00:13:22.469 "uuid": "3544d0f5-a656-4511-9ddd-fc96de9bb5c5", 00:13:22.469 "is_configured": true, 00:13:22.469 "data_offset": 2048, 00:13:22.469 "data_size": 63488 00:13:22.469 } 00:13:22.469 ] 00:13:22.469 }' 00:13:22.469 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.469 23:35:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:13:23.035 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:23.036 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:23.036 23:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:23.292 [2024-07-24 23:35:08.092661] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:23.292 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:23.292 "name": "Existed_Raid", 00:13:23.292 "aliases": [ 00:13:23.292 "2018b2f3-dd47-45de-95b4-72c0185885ee" 00:13:23.292 ], 00:13:23.292 "product_name": "Raid Volume", 00:13:23.292 "block_size": 512, 00:13:23.292 "num_blocks": 63488, 00:13:23.292 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:23.292 "assigned_rate_limits": { 00:13:23.292 "rw_ios_per_sec": 0, 00:13:23.292 "rw_mbytes_per_sec": 0, 00:13:23.293 "r_mbytes_per_sec": 0, 00:13:23.293 "w_mbytes_per_sec": 0 00:13:23.293 }, 00:13:23.293 "claimed": false, 00:13:23.293 "zoned": false, 00:13:23.293 "supported_io_types": { 00:13:23.293 "read": true, 00:13:23.293 "write": true, 00:13:23.293 "unmap": false, 00:13:23.293 "flush": false, 00:13:23.293 "reset": true, 00:13:23.293 "nvme_admin": false, 00:13:23.293 "nvme_io": false, 00:13:23.293 "nvme_io_md": false, 00:13:23.293 "write_zeroes": true, 00:13:23.293 "zcopy": false, 00:13:23.293 "get_zone_info": false, 00:13:23.293 "zone_management": false, 00:13:23.293 "zone_append": false, 00:13:23.293 "compare": false, 00:13:23.293 "compare_and_write": false, 00:13:23.293 "abort": false, 00:13:23.293 "seek_hole": false, 00:13:23.293 "seek_data": false, 00:13:23.293 "copy": false, 00:13:23.293 "nvme_iov_md": false 00:13:23.293 }, 00:13:23.293 "memory_domains": [ 00:13:23.293 { 00:13:23.293 "dma_device_id": "system", 00:13:23.293 
"dma_device_type": 1 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.293 "dma_device_type": 2 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "dma_device_id": "system", 00:13:23.293 "dma_device_type": 1 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.293 "dma_device_type": 2 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "dma_device_id": "system", 00:13:23.293 "dma_device_type": 1 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.293 "dma_device_type": 2 00:13:23.293 } 00:13:23.293 ], 00:13:23.293 "driver_specific": { 00:13:23.293 "raid": { 00:13:23.293 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:23.293 "strip_size_kb": 0, 00:13:23.293 "state": "online", 00:13:23.293 "raid_level": "raid1", 00:13:23.293 "superblock": true, 00:13:23.293 "num_base_bdevs": 3, 00:13:23.293 "num_base_bdevs_discovered": 3, 00:13:23.293 "num_base_bdevs_operational": 3, 00:13:23.293 "base_bdevs_list": [ 00:13:23.293 { 00:13:23.293 "name": "BaseBdev1", 00:13:23.293 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:23.293 "is_configured": true, 00:13:23.293 "data_offset": 2048, 00:13:23.293 "data_size": 63488 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "name": "BaseBdev2", 00:13:23.293 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:23.293 "is_configured": true, 00:13:23.293 "data_offset": 2048, 00:13:23.293 "data_size": 63488 00:13:23.293 }, 00:13:23.293 { 00:13:23.293 "name": "BaseBdev3", 00:13:23.293 "uuid": "3544d0f5-a656-4511-9ddd-fc96de9bb5c5", 00:13:23.293 "is_configured": true, 00:13:23.293 "data_offset": 2048, 00:13:23.293 "data_size": 63488 00:13:23.293 } 00:13:23.293 ] 00:13:23.293 } 00:13:23.293 } 00:13:23.293 }' 00:13:23.293 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:23.293 23:35:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:23.293 BaseBdev2 00:13:23.293 BaseBdev3' 00:13:23.293 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.293 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:23.293 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:23.550 "name": "BaseBdev1", 00:13:23.550 "aliases": [ 00:13:23.550 "4eee9793-4c92-45d1-af1d-490fbcbc9508" 00:13:23.550 ], 00:13:23.550 "product_name": "Malloc disk", 00:13:23.550 "block_size": 512, 00:13:23.550 "num_blocks": 65536, 00:13:23.550 "uuid": "4eee9793-4c92-45d1-af1d-490fbcbc9508", 00:13:23.550 "assigned_rate_limits": { 00:13:23.550 "rw_ios_per_sec": 0, 00:13:23.550 "rw_mbytes_per_sec": 0, 00:13:23.550 "r_mbytes_per_sec": 0, 00:13:23.550 "w_mbytes_per_sec": 0 00:13:23.550 }, 00:13:23.550 "claimed": true, 00:13:23.550 "claim_type": "exclusive_write", 00:13:23.550 "zoned": false, 00:13:23.550 "supported_io_types": { 00:13:23.550 "read": true, 00:13:23.550 "write": true, 00:13:23.550 "unmap": true, 00:13:23.550 "flush": true, 00:13:23.550 "reset": true, 00:13:23.550 "nvme_admin": false, 00:13:23.550 "nvme_io": false, 00:13:23.550 "nvme_io_md": false, 00:13:23.550 "write_zeroes": true, 00:13:23.550 "zcopy": true, 00:13:23.550 "get_zone_info": false, 00:13:23.550 "zone_management": false, 00:13:23.550 "zone_append": false, 00:13:23.550 "compare": false, 00:13:23.550 "compare_and_write": false, 00:13:23.550 "abort": true, 00:13:23.550 "seek_hole": false, 00:13:23.550 "seek_data": false, 00:13:23.550 "copy": true, 00:13:23.550 "nvme_iov_md": false 00:13:23.550 }, 00:13:23.550 "memory_domains": 
[ 00:13:23.550 { 00:13:23.550 "dma_device_id": "system", 00:13:23.550 "dma_device_type": 1 00:13:23.550 }, 00:13:23.550 { 00:13:23.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.550 "dma_device_type": 2 00:13:23.550 } 00:13:23.550 ], 00:13:23.550 "driver_specific": {} 00:13:23.550 }' 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.550 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:23.807 "name": "BaseBdev2", 00:13:23.807 "aliases": [ 00:13:23.807 "5f26dfcc-1e50-4091-8cee-99e69c508cc1" 00:13:23.807 ], 00:13:23.807 "product_name": "Malloc disk", 00:13:23.807 "block_size": 512, 00:13:23.807 "num_blocks": 65536, 00:13:23.807 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:23.807 "assigned_rate_limits": { 00:13:23.807 "rw_ios_per_sec": 0, 00:13:23.807 "rw_mbytes_per_sec": 0, 00:13:23.807 "r_mbytes_per_sec": 0, 00:13:23.807 "w_mbytes_per_sec": 0 00:13:23.807 }, 00:13:23.807 "claimed": true, 00:13:23.807 "claim_type": "exclusive_write", 00:13:23.807 "zoned": false, 00:13:23.807 "supported_io_types": { 00:13:23.807 "read": true, 00:13:23.807 "write": true, 00:13:23.807 "unmap": true, 00:13:23.807 "flush": true, 00:13:23.807 "reset": true, 00:13:23.807 "nvme_admin": false, 00:13:23.807 "nvme_io": false, 00:13:23.807 "nvme_io_md": false, 00:13:23.807 "write_zeroes": true, 00:13:23.807 "zcopy": true, 00:13:23.807 "get_zone_info": false, 00:13:23.807 "zone_management": false, 00:13:23.807 "zone_append": false, 00:13:23.807 "compare": false, 00:13:23.807 "compare_and_write": false, 00:13:23.807 "abort": true, 00:13:23.807 "seek_hole": false, 00:13:23.807 "seek_data": false, 00:13:23.807 "copy": true, 00:13:23.807 "nvme_iov_md": false 00:13:23.807 }, 00:13:23.807 "memory_domains": [ 00:13:23.807 { 00:13:23.807 "dma_device_id": "system", 00:13:23.807 "dma_device_type": 1 00:13:23.807 }, 00:13:23.807 { 00:13:23.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.807 "dma_device_type": 2 00:13:23.807 } 00:13:23.807 ], 00:13:23.807 "driver_specific": {} 00:13:23.807 }' 00:13:23.807 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.064 23:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.064 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.064 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.326 "name": "BaseBdev3", 00:13:24.326 "aliases": [ 00:13:24.326 "3544d0f5-a656-4511-9ddd-fc96de9bb5c5" 00:13:24.326 ], 00:13:24.326 "product_name": "Malloc disk", 00:13:24.326 "block_size": 512, 00:13:24.326 "num_blocks": 65536, 00:13:24.326 "uuid": "3544d0f5-a656-4511-9ddd-fc96de9bb5c5", 00:13:24.326 "assigned_rate_limits": { 00:13:24.326 "rw_ios_per_sec": 0, 00:13:24.326 "rw_mbytes_per_sec": 0, 00:13:24.326 "r_mbytes_per_sec": 0, 00:13:24.326 
"w_mbytes_per_sec": 0 00:13:24.326 }, 00:13:24.326 "claimed": true, 00:13:24.326 "claim_type": "exclusive_write", 00:13:24.326 "zoned": false, 00:13:24.326 "supported_io_types": { 00:13:24.326 "read": true, 00:13:24.326 "write": true, 00:13:24.326 "unmap": true, 00:13:24.326 "flush": true, 00:13:24.326 "reset": true, 00:13:24.326 "nvme_admin": false, 00:13:24.326 "nvme_io": false, 00:13:24.326 "nvme_io_md": false, 00:13:24.326 "write_zeroes": true, 00:13:24.326 "zcopy": true, 00:13:24.326 "get_zone_info": false, 00:13:24.326 "zone_management": false, 00:13:24.326 "zone_append": false, 00:13:24.326 "compare": false, 00:13:24.326 "compare_and_write": false, 00:13:24.326 "abort": true, 00:13:24.326 "seek_hole": false, 00:13:24.326 "seek_data": false, 00:13:24.326 "copy": true, 00:13:24.326 "nvme_iov_md": false 00:13:24.326 }, 00:13:24.326 "memory_domains": [ 00:13:24.326 { 00:13:24.326 "dma_device_id": "system", 00:13:24.326 "dma_device_type": 1 00:13:24.326 }, 00:13:24.326 { 00:13:24.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.326 "dma_device_type": 2 00:13:24.326 } 00:13:24.326 ], 00:13:24.326 "driver_specific": {} 00:13:24.326 }' 00:13:24.326 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.644 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:24.902 [2024-07-24 23:35:09.752829] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.902 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.160 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.160 "name": "Existed_Raid", 00:13:25.160 "uuid": "2018b2f3-dd47-45de-95b4-72c0185885ee", 00:13:25.160 "strip_size_kb": 0, 00:13:25.160 "state": "online", 00:13:25.160 "raid_level": "raid1", 00:13:25.160 "superblock": true, 00:13:25.160 "num_base_bdevs": 3, 00:13:25.160 "num_base_bdevs_discovered": 2, 00:13:25.160 "num_base_bdevs_operational": 2, 00:13:25.160 "base_bdevs_list": [ 00:13:25.160 { 00:13:25.160 "name": null, 00:13:25.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.160 "is_configured": false, 00:13:25.160 "data_offset": 2048, 00:13:25.160 "data_size": 63488 00:13:25.160 }, 00:13:25.160 { 00:13:25.160 "name": "BaseBdev2", 00:13:25.160 "uuid": "5f26dfcc-1e50-4091-8cee-99e69c508cc1", 00:13:25.160 "is_configured": true, 00:13:25.160 "data_offset": 2048, 00:13:25.160 "data_size": 63488 00:13:25.160 }, 00:13:25.160 { 00:13:25.160 "name": "BaseBdev3", 00:13:25.160 "uuid": "3544d0f5-a656-4511-9ddd-fc96de9bb5c5", 00:13:25.160 "is_configured": true, 00:13:25.160 "data_offset": 2048, 00:13:25.160 "data_size": 63488 00:13:25.160 } 
00:13:25.160 ] 00:13:25.160 }' 00:13:25.160 23:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.160 23:35:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:25.723 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:25.980 [2024-07-24 23:35:10.784380] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:25.980 23:35:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:25.980 23:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:26.237 [2024-07-24 23:35:11.131081] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:26.237 [2024-07-24 23:35:11.131148] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:26.237 [2024-07-24 23:35:11.141102] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:26.237 [2024-07-24 23:35:11.141143] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:26.237 [2024-07-24 23:35:11.141149] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x93e2a0 name Existed_Raid, state offline 00:13:26.237 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:26.237 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:26.237 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.237 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:26.494 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:26.494 BaseBdev2 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:26.751 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:27.009 [ 00:13:27.009 { 00:13:27.009 "name": "BaseBdev2", 00:13:27.009 "aliases": [ 00:13:27.009 "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523" 00:13:27.009 ], 00:13:27.009 "product_name": "Malloc disk", 00:13:27.009 "block_size": 512, 00:13:27.009 "num_blocks": 65536, 00:13:27.009 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:27.009 "assigned_rate_limits": { 00:13:27.009 "rw_ios_per_sec": 0, 00:13:27.009 "rw_mbytes_per_sec": 0, 00:13:27.009 "r_mbytes_per_sec": 0, 00:13:27.009 "w_mbytes_per_sec": 0 00:13:27.009 }, 00:13:27.009 "claimed": false, 00:13:27.009 "zoned": false, 
00:13:27.009 "supported_io_types": { 00:13:27.009 "read": true, 00:13:27.009 "write": true, 00:13:27.009 "unmap": true, 00:13:27.009 "flush": true, 00:13:27.009 "reset": true, 00:13:27.009 "nvme_admin": false, 00:13:27.009 "nvme_io": false, 00:13:27.009 "nvme_io_md": false, 00:13:27.009 "write_zeroes": true, 00:13:27.009 "zcopy": true, 00:13:27.009 "get_zone_info": false, 00:13:27.009 "zone_management": false, 00:13:27.009 "zone_append": false, 00:13:27.009 "compare": false, 00:13:27.009 "compare_and_write": false, 00:13:27.009 "abort": true, 00:13:27.009 "seek_hole": false, 00:13:27.009 "seek_data": false, 00:13:27.009 "copy": true, 00:13:27.009 "nvme_iov_md": false 00:13:27.009 }, 00:13:27.009 "memory_domains": [ 00:13:27.009 { 00:13:27.009 "dma_device_id": "system", 00:13:27.009 "dma_device_type": 1 00:13:27.009 }, 00:13:27.009 { 00:13:27.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.009 "dma_device_type": 2 00:13:27.009 } 00:13:27.009 ], 00:13:27.009 "driver_specific": {} 00:13:27.009 } 00:13:27.009 ] 00:13:27.009 23:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:27.009 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:27.009 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:27.009 23:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:27.266 BaseBdev3 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:27.266 23:35:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.266 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:27.523 [ 00:13:27.523 { 00:13:27.523 "name": "BaseBdev3", 00:13:27.523 "aliases": [ 00:13:27.523 "042b5d6e-428d-4a14-b6f1-376c79e9d418" 00:13:27.523 ], 00:13:27.523 "product_name": "Malloc disk", 00:13:27.523 "block_size": 512, 00:13:27.523 "num_blocks": 65536, 00:13:27.523 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:27.523 "assigned_rate_limits": { 00:13:27.523 "rw_ios_per_sec": 0, 00:13:27.523 "rw_mbytes_per_sec": 0, 00:13:27.523 "r_mbytes_per_sec": 0, 00:13:27.523 "w_mbytes_per_sec": 0 00:13:27.523 }, 00:13:27.523 "claimed": false, 00:13:27.523 "zoned": false, 00:13:27.523 "supported_io_types": { 00:13:27.523 "read": true, 00:13:27.523 "write": true, 00:13:27.523 "unmap": true, 00:13:27.523 "flush": true, 00:13:27.523 "reset": true, 00:13:27.523 "nvme_admin": false, 00:13:27.523 "nvme_io": false, 00:13:27.523 "nvme_io_md": false, 00:13:27.523 "write_zeroes": true, 00:13:27.523 "zcopy": true, 00:13:27.523 "get_zone_info": false, 00:13:27.523 "zone_management": false, 00:13:27.523 "zone_append": false, 00:13:27.523 "compare": false, 00:13:27.523 "compare_and_write": false, 00:13:27.523 "abort": true, 00:13:27.523 "seek_hole": false, 00:13:27.523 "seek_data": false, 00:13:27.523 "copy": true, 00:13:27.523 "nvme_iov_md": 
false 00:13:27.523 }, 00:13:27.523 "memory_domains": [ 00:13:27.523 { 00:13:27.523 "dma_device_id": "system", 00:13:27.523 "dma_device_type": 1 00:13:27.523 }, 00:13:27.523 { 00:13:27.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.523 "dma_device_type": 2 00:13:27.523 } 00:13:27.523 ], 00:13:27.523 "driver_specific": {} 00:13:27.523 } 00:13:27.523 ] 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:27.523 [2024-07-24 23:35:12.480089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:27.523 [2024-07-24 23:35:12.480119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:27.523 [2024-07-24 23:35:12.480131] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:27.523 [2024-07-24 23:35:12.481021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:27.523 23:35:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.523 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.524 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.524 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.524 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:27.782 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.782 "name": "Existed_Raid", 00:13:27.782 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:27.782 "strip_size_kb": 0, 00:13:27.782 "state": "configuring", 00:13:27.782 "raid_level": "raid1", 00:13:27.782 "superblock": true, 00:13:27.782 "num_base_bdevs": 3, 00:13:27.782 "num_base_bdevs_discovered": 2, 00:13:27.782 "num_base_bdevs_operational": 3, 00:13:27.782 "base_bdevs_list": [ 00:13:27.782 { 00:13:27.782 "name": "BaseBdev1", 00:13:27.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.782 "is_configured": false, 00:13:27.782 "data_offset": 0, 00:13:27.782 "data_size": 0 00:13:27.782 }, 00:13:27.782 { 00:13:27.782 "name": "BaseBdev2", 00:13:27.782 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:27.782 "is_configured": true, 00:13:27.782 "data_offset": 2048, 00:13:27.782 "data_size": 63488 00:13:27.782 }, 00:13:27.782 { 00:13:27.782 "name": "BaseBdev3", 
00:13:27.782 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:27.782 "is_configured": true, 00:13:27.782 "data_offset": 2048, 00:13:27.782 "data_size": 63488 00:13:27.782 } 00:13:27.782 ] 00:13:27.782 }' 00:13:27.782 23:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.782 23:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:28.347 [2024-07-24 23:35:13.262091] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.347 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.604 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.604 "name": "Existed_Raid", 00:13:28.604 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:28.604 "strip_size_kb": 0, 00:13:28.604 "state": "configuring", 00:13:28.604 "raid_level": "raid1", 00:13:28.604 "superblock": true, 00:13:28.604 "num_base_bdevs": 3, 00:13:28.604 "num_base_bdevs_discovered": 1, 00:13:28.604 "num_base_bdevs_operational": 3, 00:13:28.604 "base_bdevs_list": [ 00:13:28.604 { 00:13:28.604 "name": "BaseBdev1", 00:13:28.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.604 "is_configured": false, 00:13:28.604 "data_offset": 0, 00:13:28.604 "data_size": 0 00:13:28.604 }, 00:13:28.604 { 00:13:28.604 "name": null, 00:13:28.604 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:28.604 "is_configured": false, 00:13:28.604 "data_offset": 2048, 00:13:28.604 "data_size": 63488 00:13:28.604 }, 00:13:28.604 { 00:13:28.604 "name": "BaseBdev3", 00:13:28.604 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:28.604 "is_configured": true, 00:13:28.604 "data_offset": 2048, 00:13:28.604 "data_size": 63488 00:13:28.604 } 00:13:28.604 ] 00:13:28.604 }' 00:13:28.604 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.604 23:35:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.168 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.168 23:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:13:29.168 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:29.168 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:29.425 [2024-07-24 23:35:14.247422] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:29.425 BaseBdev1 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:29.425 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:29.683 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:29.683 [ 00:13:29.683 { 00:13:29.683 "name": "BaseBdev1", 00:13:29.683 "aliases": [ 00:13:29.683 "08732da8-9256-4de2-a121-43ff07117c2f" 00:13:29.683 ], 00:13:29.683 "product_name": "Malloc disk", 00:13:29.683 "block_size": 512, 00:13:29.683 "num_blocks": 65536, 00:13:29.683 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:29.683 
"assigned_rate_limits": { 00:13:29.683 "rw_ios_per_sec": 0, 00:13:29.683 "rw_mbytes_per_sec": 0, 00:13:29.683 "r_mbytes_per_sec": 0, 00:13:29.683 "w_mbytes_per_sec": 0 00:13:29.683 }, 00:13:29.683 "claimed": true, 00:13:29.683 "claim_type": "exclusive_write", 00:13:29.683 "zoned": false, 00:13:29.683 "supported_io_types": { 00:13:29.683 "read": true, 00:13:29.683 "write": true, 00:13:29.683 "unmap": true, 00:13:29.683 "flush": true, 00:13:29.683 "reset": true, 00:13:29.683 "nvme_admin": false, 00:13:29.683 "nvme_io": false, 00:13:29.683 "nvme_io_md": false, 00:13:29.683 "write_zeroes": true, 00:13:29.683 "zcopy": true, 00:13:29.683 "get_zone_info": false, 00:13:29.683 "zone_management": false, 00:13:29.683 "zone_append": false, 00:13:29.683 "compare": false, 00:13:29.683 "compare_and_write": false, 00:13:29.683 "abort": true, 00:13:29.683 "seek_hole": false, 00:13:29.683 "seek_data": false, 00:13:29.683 "copy": true, 00:13:29.683 "nvme_iov_md": false 00:13:29.683 }, 00:13:29.683 "memory_domains": [ 00:13:29.683 { 00:13:29.683 "dma_device_id": "system", 00:13:29.683 "dma_device_type": 1 00:13:29.683 }, 00:13:29.683 { 00:13:29.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.683 "dma_device_type": 2 00:13:29.683 } 00:13:29.683 ], 00:13:29.683 "driver_specific": {} 00:13:29.683 } 00:13:29.683 ] 00:13:29.683 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:29.683 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:29.683 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.684 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.941 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.941 "name": "Existed_Raid", 00:13:29.941 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:29.941 "strip_size_kb": 0, 00:13:29.941 "state": "configuring", 00:13:29.941 "raid_level": "raid1", 00:13:29.941 "superblock": true, 00:13:29.941 "num_base_bdevs": 3, 00:13:29.942 "num_base_bdevs_discovered": 2, 00:13:29.942 "num_base_bdevs_operational": 3, 00:13:29.942 "base_bdevs_list": [ 00:13:29.942 { 00:13:29.942 "name": "BaseBdev1", 00:13:29.942 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:29.942 "is_configured": true, 00:13:29.942 "data_offset": 2048, 00:13:29.942 "data_size": 63488 00:13:29.942 }, 00:13:29.942 { 00:13:29.942 "name": null, 00:13:29.942 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:29.942 "is_configured": false, 00:13:29.942 "data_offset": 2048, 00:13:29.942 "data_size": 63488 00:13:29.942 }, 00:13:29.942 { 00:13:29.942 "name": "BaseBdev3", 00:13:29.942 "uuid": 
"042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:29.942 "is_configured": true, 00:13:29.942 "data_offset": 2048, 00:13:29.942 "data_size": 63488 00:13:29.942 } 00:13:29.942 ] 00:13:29.942 }' 00:13:29.942 23:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.942 23:35:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:30.506 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.506 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:30.506 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:30.506 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:30.765 [2024-07-24 23:35:15.582887] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.765 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.022 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.022 "name": "Existed_Raid", 00:13:31.022 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:31.022 "strip_size_kb": 0, 00:13:31.022 "state": "configuring", 00:13:31.022 "raid_level": "raid1", 00:13:31.022 "superblock": true, 00:13:31.022 "num_base_bdevs": 3, 00:13:31.022 "num_base_bdevs_discovered": 1, 00:13:31.022 "num_base_bdevs_operational": 3, 00:13:31.022 "base_bdevs_list": [ 00:13:31.022 { 00:13:31.022 "name": "BaseBdev1", 00:13:31.022 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:31.022 "is_configured": true, 00:13:31.022 "data_offset": 2048, 00:13:31.022 "data_size": 63488 00:13:31.022 }, 00:13:31.022 { 00:13:31.022 "name": null, 00:13:31.022 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:31.022 "is_configured": false, 00:13:31.022 "data_offset": 2048, 00:13:31.022 "data_size": 63488 00:13:31.022 }, 00:13:31.022 { 00:13:31.022 "name": null, 00:13:31.022 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:31.022 "is_configured": false, 00:13:31.022 "data_offset": 2048, 00:13:31.022 "data_size": 63488 00:13:31.022 } 00:13:31.022 ] 00:13:31.022 }' 00:13:31.022 23:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:13:31.022 23:35:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:31.279 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.279 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:31.537 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:31.537 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:31.795 [2024-07-24 23:35:16.593529] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.795 "name": "Existed_Raid", 00:13:31.795 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:31.795 "strip_size_kb": 0, 00:13:31.795 "state": "configuring", 00:13:31.795 "raid_level": "raid1", 00:13:31.795 "superblock": true, 00:13:31.795 "num_base_bdevs": 3, 00:13:31.795 "num_base_bdevs_discovered": 2, 00:13:31.795 "num_base_bdevs_operational": 3, 00:13:31.795 "base_bdevs_list": [ 00:13:31.795 { 00:13:31.795 "name": "BaseBdev1", 00:13:31.795 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:31.795 "is_configured": true, 00:13:31.795 "data_offset": 2048, 00:13:31.795 "data_size": 63488 00:13:31.795 }, 00:13:31.795 { 00:13:31.795 "name": null, 00:13:31.795 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:31.795 "is_configured": false, 00:13:31.795 "data_offset": 2048, 00:13:31.795 "data_size": 63488 00:13:31.795 }, 00:13:31.795 { 00:13:31.795 "name": "BaseBdev3", 00:13:31.795 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:31.795 "is_configured": true, 00:13:31.795 "data_offset": 2048, 00:13:31.795 "data_size": 63488 00:13:31.795 } 00:13:31.795 ] 00:13:31.795 }' 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.795 23:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:32.359 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.359 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:32.617 [2024-07-24 23:35:17.564160] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.617 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.874 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.874 "name": "Existed_Raid", 00:13:32.874 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:32.874 "strip_size_kb": 0, 00:13:32.874 "state": "configuring", 00:13:32.874 "raid_level": "raid1", 00:13:32.874 "superblock": true, 00:13:32.874 "num_base_bdevs": 3, 00:13:32.874 "num_base_bdevs_discovered": 1, 00:13:32.874 "num_base_bdevs_operational": 3, 00:13:32.874 "base_bdevs_list": [ 00:13:32.874 { 00:13:32.874 "name": null, 00:13:32.874 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:32.874 "is_configured": false, 00:13:32.874 "data_offset": 2048, 00:13:32.874 "data_size": 63488 00:13:32.874 }, 00:13:32.874 { 00:13:32.874 "name": null, 00:13:32.874 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:32.874 "is_configured": false, 00:13:32.874 "data_offset": 2048, 00:13:32.874 "data_size": 63488 00:13:32.874 }, 00:13:32.874 { 00:13:32.874 "name": "BaseBdev3", 00:13:32.874 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:32.874 "is_configured": true, 00:13:32.874 "data_offset": 2048, 00:13:32.874 "data_size": 63488 00:13:32.874 } 00:13:32.874 ] 00:13:32.874 }' 00:13:32.874 23:35:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.874 23:35:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:33.441 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.441 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:13:33.441 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:33.441 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:33.700 [2024-07-24 23:35:18.540500] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.700 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:13:33.958 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.958 "name": "Existed_Raid", 00:13:33.958 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:33.958 "strip_size_kb": 0, 00:13:33.958 "state": "configuring", 00:13:33.958 "raid_level": "raid1", 00:13:33.958 "superblock": true, 00:13:33.958 "num_base_bdevs": 3, 00:13:33.958 "num_base_bdevs_discovered": 2, 00:13:33.958 "num_base_bdevs_operational": 3, 00:13:33.958 "base_bdevs_list": [ 00:13:33.958 { 00:13:33.958 "name": null, 00:13:33.958 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:33.958 "is_configured": false, 00:13:33.958 "data_offset": 2048, 00:13:33.958 "data_size": 63488 00:13:33.958 }, 00:13:33.958 { 00:13:33.958 "name": "BaseBdev2", 00:13:33.958 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:33.958 "is_configured": true, 00:13:33.958 "data_offset": 2048, 00:13:33.958 "data_size": 63488 00:13:33.958 }, 00:13:33.958 { 00:13:33.958 "name": "BaseBdev3", 00:13:33.958 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:33.958 "is_configured": true, 00:13:33.958 "data_offset": 2048, 00:13:33.958 "data_size": 63488 00:13:33.958 } 00:13:33.958 ] 00:13:33.958 }' 00:13:33.958 23:35:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.958 23:35:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:34.216 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.216 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:34.475 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:34.475 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.475 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 08732da8-9256-4de2-a121-43ff07117c2f 00:13:34.733 [2024-07-24 23:35:19.714226] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:34.733 [2024-07-24 23:35:19.714338] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x935d10 00:13:34.733 [2024-07-24 23:35:19.714346] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:34.733 [2024-07-24 23:35:19.714466] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaef1e0 00:13:34.733 [2024-07-24 23:35:19.714581] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x935d10 00:13:34.733 [2024-07-24 23:35:19.714586] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x935d10 00:13:34.733 [2024-07-24 23:35:19.714648] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.733 NewBaseBdev 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:34.733 
23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:34.733 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.992 23:35:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:35.250 [ 00:13:35.250 { 00:13:35.250 "name": "NewBaseBdev", 00:13:35.250 "aliases": [ 00:13:35.250 "08732da8-9256-4de2-a121-43ff07117c2f" 00:13:35.250 ], 00:13:35.250 "product_name": "Malloc disk", 00:13:35.250 "block_size": 512, 00:13:35.250 "num_blocks": 65536, 00:13:35.250 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:35.250 "assigned_rate_limits": { 00:13:35.250 "rw_ios_per_sec": 0, 00:13:35.250 "rw_mbytes_per_sec": 0, 00:13:35.250 "r_mbytes_per_sec": 0, 00:13:35.250 "w_mbytes_per_sec": 0 00:13:35.250 }, 00:13:35.250 "claimed": true, 00:13:35.250 "claim_type": "exclusive_write", 00:13:35.250 "zoned": false, 00:13:35.250 "supported_io_types": { 00:13:35.250 "read": true, 00:13:35.250 "write": true, 00:13:35.250 "unmap": true, 00:13:35.250 "flush": true, 00:13:35.250 "reset": true, 00:13:35.250 "nvme_admin": false, 00:13:35.250 "nvme_io": false, 00:13:35.250 "nvme_io_md": false, 00:13:35.250 "write_zeroes": true, 00:13:35.250 "zcopy": true, 00:13:35.250 "get_zone_info": false, 00:13:35.250 "zone_management": false, 00:13:35.250 "zone_append": false, 00:13:35.250 "compare": false, 00:13:35.250 "compare_and_write": false, 00:13:35.250 "abort": true, 00:13:35.250 "seek_hole": false, 00:13:35.250 "seek_data": false, 00:13:35.250 "copy": true, 00:13:35.250 "nvme_iov_md": false 00:13:35.250 }, 00:13:35.250 "memory_domains": [ 00:13:35.250 { 00:13:35.250 "dma_device_id": "system", 00:13:35.250 "dma_device_type": 1 00:13:35.250 
}, 00:13:35.250 { 00:13:35.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.250 "dma_device_type": 2 00:13:35.250 } 00:13:35.250 ], 00:13:35.250 "driver_specific": {} 00:13:35.250 } 00:13:35.250 ] 00:13:35.250 23:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:35.250 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:35.250 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.250 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.250 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.251 "name": "Existed_Raid", 00:13:35.251 "uuid": 
"923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:35.251 "strip_size_kb": 0, 00:13:35.251 "state": "online", 00:13:35.251 "raid_level": "raid1", 00:13:35.251 "superblock": true, 00:13:35.251 "num_base_bdevs": 3, 00:13:35.251 "num_base_bdevs_discovered": 3, 00:13:35.251 "num_base_bdevs_operational": 3, 00:13:35.251 "base_bdevs_list": [ 00:13:35.251 { 00:13:35.251 "name": "NewBaseBdev", 00:13:35.251 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:35.251 "is_configured": true, 00:13:35.251 "data_offset": 2048, 00:13:35.251 "data_size": 63488 00:13:35.251 }, 00:13:35.251 { 00:13:35.251 "name": "BaseBdev2", 00:13:35.251 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:35.251 "is_configured": true, 00:13:35.251 "data_offset": 2048, 00:13:35.251 "data_size": 63488 00:13:35.251 }, 00:13:35.251 { 00:13:35.251 "name": "BaseBdev3", 00:13:35.251 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:35.251 "is_configured": true, 00:13:35.251 "data_offset": 2048, 00:13:35.251 "data_size": 63488 00:13:35.251 } 00:13:35.251 ] 00:13:35.251 }' 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.251 23:35:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:35.819 23:35:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:35.819 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:36.077 [2024-07-24 23:35:20.857377] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:36.077 "name": "Existed_Raid", 00:13:36.077 "aliases": [ 00:13:36.077 "923e71f0-6bc3-4689-8f15-15a58643a843" 00:13:36.077 ], 00:13:36.077 "product_name": "Raid Volume", 00:13:36.077 "block_size": 512, 00:13:36.077 "num_blocks": 63488, 00:13:36.077 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:36.077 "assigned_rate_limits": { 00:13:36.077 "rw_ios_per_sec": 0, 00:13:36.077 "rw_mbytes_per_sec": 0, 00:13:36.077 "r_mbytes_per_sec": 0, 00:13:36.077 "w_mbytes_per_sec": 0 00:13:36.077 }, 00:13:36.077 "claimed": false, 00:13:36.077 "zoned": false, 00:13:36.077 "supported_io_types": { 00:13:36.077 "read": true, 00:13:36.077 "write": true, 00:13:36.077 "unmap": false, 00:13:36.077 "flush": false, 00:13:36.077 "reset": true, 00:13:36.077 "nvme_admin": false, 00:13:36.077 "nvme_io": false, 00:13:36.077 "nvme_io_md": false, 00:13:36.077 "write_zeroes": true, 00:13:36.077 "zcopy": false, 00:13:36.077 "get_zone_info": false, 00:13:36.077 "zone_management": false, 00:13:36.077 "zone_append": false, 00:13:36.077 "compare": false, 00:13:36.077 "compare_and_write": false, 00:13:36.077 "abort": false, 00:13:36.077 "seek_hole": false, 00:13:36.077 "seek_data": false, 00:13:36.077 "copy": false, 00:13:36.077 "nvme_iov_md": false 00:13:36.077 }, 00:13:36.077 "memory_domains": [ 00:13:36.077 { 00:13:36.077 "dma_device_id": "system", 00:13:36.077 "dma_device_type": 1 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.077 
"dma_device_type": 2 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "dma_device_id": "system", 00:13:36.077 "dma_device_type": 1 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.077 "dma_device_type": 2 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "dma_device_id": "system", 00:13:36.077 "dma_device_type": 1 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.077 "dma_device_type": 2 00:13:36.077 } 00:13:36.077 ], 00:13:36.077 "driver_specific": { 00:13:36.077 "raid": { 00:13:36.077 "uuid": "923e71f0-6bc3-4689-8f15-15a58643a843", 00:13:36.077 "strip_size_kb": 0, 00:13:36.077 "state": "online", 00:13:36.077 "raid_level": "raid1", 00:13:36.077 "superblock": true, 00:13:36.077 "num_base_bdevs": 3, 00:13:36.077 "num_base_bdevs_discovered": 3, 00:13:36.077 "num_base_bdevs_operational": 3, 00:13:36.077 "base_bdevs_list": [ 00:13:36.077 { 00:13:36.077 "name": "NewBaseBdev", 00:13:36.077 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:36.077 "is_configured": true, 00:13:36.077 "data_offset": 2048, 00:13:36.077 "data_size": 63488 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "name": "BaseBdev2", 00:13:36.077 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:36.077 "is_configured": true, 00:13:36.077 "data_offset": 2048, 00:13:36.077 "data_size": 63488 00:13:36.077 }, 00:13:36.077 { 00:13:36.077 "name": "BaseBdev3", 00:13:36.077 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:36.077 "is_configured": true, 00:13:36.077 "data_offset": 2048, 00:13:36.077 "data_size": 63488 00:13:36.077 } 00:13:36.077 ] 00:13:36.077 } 00:13:36.077 } 00:13:36.077 }' 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:36.077 BaseBdev2 00:13:36.077 
BaseBdev3' 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:36.077 23:35:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.336 "name": "NewBaseBdev", 00:13:36.336 "aliases": [ 00:13:36.336 "08732da8-9256-4de2-a121-43ff07117c2f" 00:13:36.336 ], 00:13:36.336 "product_name": "Malloc disk", 00:13:36.336 "block_size": 512, 00:13:36.336 "num_blocks": 65536, 00:13:36.336 "uuid": "08732da8-9256-4de2-a121-43ff07117c2f", 00:13:36.336 "assigned_rate_limits": { 00:13:36.336 "rw_ios_per_sec": 0, 00:13:36.336 "rw_mbytes_per_sec": 0, 00:13:36.336 "r_mbytes_per_sec": 0, 00:13:36.336 "w_mbytes_per_sec": 0 00:13:36.336 }, 00:13:36.336 "claimed": true, 00:13:36.336 "claim_type": "exclusive_write", 00:13:36.336 "zoned": false, 00:13:36.336 "supported_io_types": { 00:13:36.336 "read": true, 00:13:36.336 "write": true, 00:13:36.336 "unmap": true, 00:13:36.336 "flush": true, 00:13:36.336 "reset": true, 00:13:36.336 "nvme_admin": false, 00:13:36.336 "nvme_io": false, 00:13:36.336 "nvme_io_md": false, 00:13:36.336 "write_zeroes": true, 00:13:36.336 "zcopy": true, 00:13:36.336 "get_zone_info": false, 00:13:36.336 "zone_management": false, 00:13:36.336 "zone_append": false, 00:13:36.336 "compare": false, 00:13:36.336 "compare_and_write": false, 00:13:36.336 "abort": true, 00:13:36.336 "seek_hole": false, 00:13:36.336 "seek_data": false, 00:13:36.336 "copy": true, 00:13:36.336 "nvme_iov_md": false 00:13:36.336 }, 00:13:36.336 "memory_domains": [ 00:13:36.336 { 00:13:36.336 "dma_device_id": "system", 00:13:36.336 "dma_device_type": 1 00:13:36.336 }, 00:13:36.336 { 
00:13:36.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.336 "dma_device_type": 2 00:13:36.336 } 00:13:36.336 ], 00:13:36.336 "driver_specific": {} 00:13:36.336 }' 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.336 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.594 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.594 "name": 
"BaseBdev2", 00:13:36.594 "aliases": [ 00:13:36.594 "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523" 00:13:36.594 ], 00:13:36.594 "product_name": "Malloc disk", 00:13:36.594 "block_size": 512, 00:13:36.594 "num_blocks": 65536, 00:13:36.594 "uuid": "6cf1b8dc-77d3-4d3f-b773-e7dfaf62b523", 00:13:36.594 "assigned_rate_limits": { 00:13:36.594 "rw_ios_per_sec": 0, 00:13:36.594 "rw_mbytes_per_sec": 0, 00:13:36.594 "r_mbytes_per_sec": 0, 00:13:36.594 "w_mbytes_per_sec": 0 00:13:36.594 }, 00:13:36.595 "claimed": true, 00:13:36.595 "claim_type": "exclusive_write", 00:13:36.595 "zoned": false, 00:13:36.595 "supported_io_types": { 00:13:36.595 "read": true, 00:13:36.595 "write": true, 00:13:36.595 "unmap": true, 00:13:36.595 "flush": true, 00:13:36.595 "reset": true, 00:13:36.595 "nvme_admin": false, 00:13:36.595 "nvme_io": false, 00:13:36.595 "nvme_io_md": false, 00:13:36.595 "write_zeroes": true, 00:13:36.595 "zcopy": true, 00:13:36.595 "get_zone_info": false, 00:13:36.595 "zone_management": false, 00:13:36.595 "zone_append": false, 00:13:36.595 "compare": false, 00:13:36.595 "compare_and_write": false, 00:13:36.595 "abort": true, 00:13:36.595 "seek_hole": false, 00:13:36.595 "seek_data": false, 00:13:36.595 "copy": true, 00:13:36.595 "nvme_iov_md": false 00:13:36.595 }, 00:13:36.595 "memory_domains": [ 00:13:36.595 { 00:13:36.595 "dma_device_id": "system", 00:13:36.595 "dma_device_type": 1 00:13:36.595 }, 00:13:36.595 { 00:13:36.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.595 "dma_device_type": 2 00:13:36.595 } 00:13:36.595 ], 00:13:36.595 "driver_specific": {} 00:13:36.595 }' 00:13:36.595 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.595 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:36.853 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.111 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.111 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.111 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:37.111 23:35:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:37.111 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:37.111 "name": "BaseBdev3", 00:13:37.111 "aliases": [ 00:13:37.111 "042b5d6e-428d-4a14-b6f1-376c79e9d418" 00:13:37.111 ], 00:13:37.111 "product_name": "Malloc disk", 00:13:37.111 "block_size": 512, 00:13:37.111 "num_blocks": 65536, 00:13:37.111 "uuid": "042b5d6e-428d-4a14-b6f1-376c79e9d418", 00:13:37.111 "assigned_rate_limits": { 00:13:37.111 "rw_ios_per_sec": 0, 00:13:37.111 "rw_mbytes_per_sec": 0, 00:13:37.111 "r_mbytes_per_sec": 0, 00:13:37.111 "w_mbytes_per_sec": 0 00:13:37.111 }, 00:13:37.111 "claimed": true, 00:13:37.111 "claim_type": "exclusive_write", 00:13:37.111 "zoned": 
false, 00:13:37.111 "supported_io_types": { 00:13:37.111 "read": true, 00:13:37.111 "write": true, 00:13:37.111 "unmap": true, 00:13:37.111 "flush": true, 00:13:37.111 "reset": true, 00:13:37.111 "nvme_admin": false, 00:13:37.111 "nvme_io": false, 00:13:37.111 "nvme_io_md": false, 00:13:37.111 "write_zeroes": true, 00:13:37.111 "zcopy": true, 00:13:37.111 "get_zone_info": false, 00:13:37.111 "zone_management": false, 00:13:37.111 "zone_append": false, 00:13:37.111 "compare": false, 00:13:37.111 "compare_and_write": false, 00:13:37.111 "abort": true, 00:13:37.111 "seek_hole": false, 00:13:37.111 "seek_data": false, 00:13:37.111 "copy": true, 00:13:37.111 "nvme_iov_md": false 00:13:37.111 }, 00:13:37.111 "memory_domains": [ 00:13:37.111 { 00:13:37.111 "dma_device_id": "system", 00:13:37.111 "dma_device_type": 1 00:13:37.111 }, 00:13:37.111 { 00:13:37.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.111 "dma_device_type": 2 00:13:37.111 } 00:13:37.111 ], 00:13:37.111 "driver_specific": {} 00:13:37.111 }' 00:13:37.111 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.111 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.370 23:35:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.370 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:37.628 [2024-07-24 23:35:22.525501] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:37.628 [2024-07-24 23:35:22.525522] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:37.628 [2024-07-24 23:35:22.525561] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:37.628 [2024-07-24 23:35:22.525737] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:37.628 [2024-07-24 23:35:22.525742] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x935d10 name Existed_Raid, state offline 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 292744 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 292744 ']' 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 292744 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 292744 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 
00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 292744' 00:13:37.628 killing process with pid 292744 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 292744 00:13:37.628 [2024-07-24 23:35:22.588449] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:37.628 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 292744 00:13:37.628 [2024-07-24 23:35:22.611974] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.887 23:35:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:37.887 00:13:37.887 real 0m21.284s 00:13:37.887 user 0m39.666s 00:13:37.887 sys 0m3.311s 00:13:37.887 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:37.887 23:35:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:37.887 ************************************ 00:13:37.887 END TEST raid_state_function_test_sb 00:13:37.887 ************************************ 00:13:37.887 23:35:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:13:37.887 23:35:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:13:37.887 23:35:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:37.887 23:35:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.887 ************************************ 00:13:37.887 START TEST raid_superblock_test 00:13:37.887 ************************************ 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 3 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=296860 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 296860 /var/tmp/spdk-raid.sock 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 296860 ']' 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.887 
23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:37.887 23:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.146 [2024-07-24 23:35:22.901779] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:13:38.146 [2024-07-24 23:35:22.901817] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid296860 ] 00:13:38.146 [2024-07-24 23:35:22.965045] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:38.146 [2024-07-24 23:35:23.042447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:38.146 [2024-07-24 23:35:23.091756] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.146 [2024-07-24 23:35:23.091780] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:38.711 23:35:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:38.711 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:38.968 malloc1 00:13:38.968 23:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:39.225 [2024-07-24 23:35:24.011591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:39.225 [2024-07-24 23:35:24.011623] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.225 [2024-07-24 23:35:24.011634] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f0dd0 00:13:39.225 [2024-07-24 23:35:24.011640] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.225 [2024-07-24 23:35:24.012685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.225 [2024-07-24 23:35:24.012706] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:39.225 pt1 
00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:39.225 malloc2 00:13:39.225 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:39.482 [2024-07-24 23:35:24.343937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:39.482 [2024-07-24 23:35:24.343986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.482 [2024-07-24 23:35:24.343998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f18d0 00:13:39.482 [2024-07-24 23:35:24.344004] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.482 [2024-07-24 23:35:24.345041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.482 [2024-07-24 
23:35:24.345062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:39.482 pt2 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:39.482 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:39.740 malloc3 00:13:39.740 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:39.740 [2024-07-24 23:35:24.684286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:39.740 [2024-07-24 23:35:24.684319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.740 [2024-07-24 23:35:24.684329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b2740 00:13:39.740 [2024-07-24 23:35:24.684335] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.740 [2024-07-24 
23:35:24.685331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.740 [2024-07-24 23:35:24.685350] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:39.740 pt3 00:13:39.740 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:39.740 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:39.740 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:39.998 [2024-07-24 23:35:24.840709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:39.998 [2024-07-24 23:35:24.841542] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:39.998 [2024-07-24 23:35:24.841578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:39.998 [2024-07-24 23:35:24.841677] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b2d20 00:13:39.998 [2024-07-24 23:35:24.841683] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:39.998 [2024-07-24 23:35:24.841806] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f31c0 00:13:39.998 [2024-07-24 23:35:24.841906] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b2d20 00:13:39.998 [2024-07-24 23:35:24.841911] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b2d20 00:13:39.998 [2024-07-24 23:35:24.841979] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.998 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.999 23:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.272 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.272 "name": "raid_bdev1", 00:13:40.272 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:40.272 "strip_size_kb": 0, 00:13:40.272 "state": "online", 00:13:40.272 "raid_level": "raid1", 00:13:40.272 "superblock": true, 00:13:40.272 "num_base_bdevs": 3, 00:13:40.272 "num_base_bdevs_discovered": 3, 00:13:40.272 "num_base_bdevs_operational": 3, 00:13:40.272 "base_bdevs_list": [ 00:13:40.272 { 00:13:40.272 "name": "pt1", 00:13:40.272 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:40.272 "is_configured": true, 00:13:40.272 "data_offset": 2048, 00:13:40.272 "data_size": 63488 00:13:40.272 }, 00:13:40.272 { 00:13:40.272 "name": "pt2", 00:13:40.272 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:13:40.272 "is_configured": true, 00:13:40.272 "data_offset": 2048, 00:13:40.272 "data_size": 63488 00:13:40.272 }, 00:13:40.272 { 00:13:40.272 "name": "pt3", 00:13:40.272 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:40.272 "is_configured": true, 00:13:40.272 "data_offset": 2048, 00:13:40.272 "data_size": 63488 00:13:40.272 } 00:13:40.272 ] 00:13:40.272 }' 00:13:40.272 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.272 23:35:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:40.554 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:40.812 [2024-07-24 23:35:25.675012] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:40.812 "name": "raid_bdev1", 00:13:40.812 "aliases": [ 00:13:40.812 "1b775eb9-ddee-4a6d-8940-6035b8461585" 00:13:40.812 ], 00:13:40.812 "product_name": "Raid Volume", 00:13:40.812 "block_size": 512, 00:13:40.812 "num_blocks": 
63488, 00:13:40.812 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:40.812 "assigned_rate_limits": { 00:13:40.812 "rw_ios_per_sec": 0, 00:13:40.812 "rw_mbytes_per_sec": 0, 00:13:40.812 "r_mbytes_per_sec": 0, 00:13:40.812 "w_mbytes_per_sec": 0 00:13:40.812 }, 00:13:40.812 "claimed": false, 00:13:40.812 "zoned": false, 00:13:40.812 "supported_io_types": { 00:13:40.812 "read": true, 00:13:40.812 "write": true, 00:13:40.812 "unmap": false, 00:13:40.812 "flush": false, 00:13:40.812 "reset": true, 00:13:40.812 "nvme_admin": false, 00:13:40.812 "nvme_io": false, 00:13:40.812 "nvme_io_md": false, 00:13:40.812 "write_zeroes": true, 00:13:40.812 "zcopy": false, 00:13:40.812 "get_zone_info": false, 00:13:40.812 "zone_management": false, 00:13:40.812 "zone_append": false, 00:13:40.812 "compare": false, 00:13:40.812 "compare_and_write": false, 00:13:40.812 "abort": false, 00:13:40.812 "seek_hole": false, 00:13:40.812 "seek_data": false, 00:13:40.812 "copy": false, 00:13:40.812 "nvme_iov_md": false 00:13:40.812 }, 00:13:40.812 "memory_domains": [ 00:13:40.812 { 00:13:40.812 "dma_device_id": "system", 00:13:40.812 "dma_device_type": 1 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.812 "dma_device_type": 2 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "dma_device_id": "system", 00:13:40.812 "dma_device_type": 1 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.812 "dma_device_type": 2 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "dma_device_id": "system", 00:13:40.812 "dma_device_type": 1 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.812 "dma_device_type": 2 00:13:40.812 } 00:13:40.812 ], 00:13:40.812 "driver_specific": { 00:13:40.812 "raid": { 00:13:40.812 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:40.812 "strip_size_kb": 0, 00:13:40.812 "state": "online", 00:13:40.812 "raid_level": "raid1", 00:13:40.812 "superblock": true, 
00:13:40.812 "num_base_bdevs": 3, 00:13:40.812 "num_base_bdevs_discovered": 3, 00:13:40.812 "num_base_bdevs_operational": 3, 00:13:40.812 "base_bdevs_list": [ 00:13:40.812 { 00:13:40.812 "name": "pt1", 00:13:40.812 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:40.812 "is_configured": true, 00:13:40.812 "data_offset": 2048, 00:13:40.812 "data_size": 63488 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "name": "pt2", 00:13:40.812 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:40.812 "is_configured": true, 00:13:40.812 "data_offset": 2048, 00:13:40.812 "data_size": 63488 00:13:40.812 }, 00:13:40.812 { 00:13:40.812 "name": "pt3", 00:13:40.812 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:40.812 "is_configured": true, 00:13:40.812 "data_offset": 2048, 00:13:40.812 "data_size": 63488 00:13:40.812 } 00:13:40.812 ] 00:13:40.812 } 00:13:40.812 } 00:13:40.812 }' 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:40.812 pt2 00:13:40.812 pt3' 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:40.812 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.070 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.070 "name": "pt1", 00:13:41.070 "aliases": [ 00:13:41.070 "00000000-0000-0000-0000-000000000001" 00:13:41.070 ], 00:13:41.070 "product_name": "passthru", 00:13:41.070 "block_size": 512, 00:13:41.070 "num_blocks": 65536, 00:13:41.070 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:41.070 
"assigned_rate_limits": { 00:13:41.070 "rw_ios_per_sec": 0, 00:13:41.070 "rw_mbytes_per_sec": 0, 00:13:41.070 "r_mbytes_per_sec": 0, 00:13:41.070 "w_mbytes_per_sec": 0 00:13:41.070 }, 00:13:41.070 "claimed": true, 00:13:41.070 "claim_type": "exclusive_write", 00:13:41.070 "zoned": false, 00:13:41.070 "supported_io_types": { 00:13:41.070 "read": true, 00:13:41.070 "write": true, 00:13:41.070 "unmap": true, 00:13:41.070 "flush": true, 00:13:41.070 "reset": true, 00:13:41.070 "nvme_admin": false, 00:13:41.070 "nvme_io": false, 00:13:41.070 "nvme_io_md": false, 00:13:41.070 "write_zeroes": true, 00:13:41.070 "zcopy": true, 00:13:41.070 "get_zone_info": false, 00:13:41.070 "zone_management": false, 00:13:41.070 "zone_append": false, 00:13:41.070 "compare": false, 00:13:41.070 "compare_and_write": false, 00:13:41.070 "abort": true, 00:13:41.070 "seek_hole": false, 00:13:41.070 "seek_data": false, 00:13:41.070 "copy": true, 00:13:41.070 "nvme_iov_md": false 00:13:41.070 }, 00:13:41.070 "memory_domains": [ 00:13:41.071 { 00:13:41.071 "dma_device_id": "system", 00:13:41.071 "dma_device_type": 1 00:13:41.071 }, 00:13:41.071 { 00:13:41.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.071 "dma_device_type": 2 00:13:41.071 } 00:13:41.071 ], 00:13:41.071 "driver_specific": { 00:13:41.071 "passthru": { 00:13:41.071 "name": "pt1", 00:13:41.071 "base_bdev_name": "malloc1" 00:13:41.071 } 00:13:41.071 } 00:13:41.071 }' 00:13:41.071 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.071 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.071 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.071 23:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.071 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
[[ null == null ]] 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.329 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.588 "name": "pt2", 00:13:41.588 "aliases": [ 00:13:41.588 "00000000-0000-0000-0000-000000000002" 00:13:41.588 ], 00:13:41.588 "product_name": "passthru", 00:13:41.588 "block_size": 512, 00:13:41.588 "num_blocks": 65536, 00:13:41.588 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:41.588 "assigned_rate_limits": { 00:13:41.588 "rw_ios_per_sec": 0, 00:13:41.588 "rw_mbytes_per_sec": 0, 00:13:41.588 "r_mbytes_per_sec": 0, 00:13:41.588 "w_mbytes_per_sec": 0 00:13:41.588 }, 00:13:41.588 "claimed": true, 00:13:41.588 "claim_type": "exclusive_write", 00:13:41.588 "zoned": false, 00:13:41.588 "supported_io_types": { 00:13:41.588 "read": true, 00:13:41.588 "write": true, 00:13:41.588 "unmap": true, 00:13:41.588 "flush": true, 00:13:41.588 "reset": true, 00:13:41.588 "nvme_admin": false, 00:13:41.588 "nvme_io": false, 00:13:41.588 "nvme_io_md": false, 00:13:41.588 
"write_zeroes": true, 00:13:41.588 "zcopy": true, 00:13:41.588 "get_zone_info": false, 00:13:41.588 "zone_management": false, 00:13:41.588 "zone_append": false, 00:13:41.588 "compare": false, 00:13:41.588 "compare_and_write": false, 00:13:41.588 "abort": true, 00:13:41.588 "seek_hole": false, 00:13:41.588 "seek_data": false, 00:13:41.588 "copy": true, 00:13:41.588 "nvme_iov_md": false 00:13:41.588 }, 00:13:41.588 "memory_domains": [ 00:13:41.588 { 00:13:41.588 "dma_device_id": "system", 00:13:41.588 "dma_device_type": 1 00:13:41.588 }, 00:13:41.588 { 00:13:41.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.588 "dma_device_type": 2 00:13:41.588 } 00:13:41.588 ], 00:13:41.588 "driver_specific": { 00:13:41.588 "passthru": { 00:13:41.588 "name": "pt2", 00:13:41.588 "base_bdev_name": "malloc2" 00:13:41.588 } 00:13:41.588 } 00:13:41.588 }' 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:41.588 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.846 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:42.104 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:42.104 "name": "pt3", 00:13:42.104 "aliases": [ 00:13:42.104 "00000000-0000-0000-0000-000000000003" 00:13:42.104 ], 00:13:42.104 "product_name": "passthru", 00:13:42.104 "block_size": 512, 00:13:42.104 "num_blocks": 65536, 00:13:42.104 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:42.104 "assigned_rate_limits": { 00:13:42.104 "rw_ios_per_sec": 0, 00:13:42.104 "rw_mbytes_per_sec": 0, 00:13:42.104 "r_mbytes_per_sec": 0, 00:13:42.104 "w_mbytes_per_sec": 0 00:13:42.104 }, 00:13:42.104 "claimed": true, 00:13:42.104 "claim_type": "exclusive_write", 00:13:42.104 "zoned": false, 00:13:42.104 "supported_io_types": { 00:13:42.104 "read": true, 00:13:42.104 "write": true, 00:13:42.104 "unmap": true, 00:13:42.104 "flush": true, 00:13:42.104 "reset": true, 00:13:42.104 "nvme_admin": false, 00:13:42.104 "nvme_io": false, 00:13:42.104 "nvme_io_md": false, 00:13:42.104 "write_zeroes": true, 00:13:42.104 "zcopy": true, 00:13:42.104 "get_zone_info": false, 00:13:42.104 "zone_management": false, 00:13:42.104 "zone_append": false, 00:13:42.104 "compare": false, 00:13:42.104 "compare_and_write": false, 00:13:42.104 "abort": true, 00:13:42.104 "seek_hole": false, 00:13:42.104 "seek_data": false, 00:13:42.104 "copy": true, 00:13:42.104 "nvme_iov_md": false 00:13:42.104 }, 00:13:42.104 "memory_domains": [ 00:13:42.104 { 00:13:42.104 "dma_device_id": "system", 00:13:42.104 "dma_device_type": 1 00:13:42.104 }, 00:13:42.104 { 00:13:42.104 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.104 "dma_device_type": 2 00:13:42.104 } 00:13:42.104 ], 00:13:42.104 "driver_specific": { 00:13:42.105 "passthru": { 00:13:42.105 "name": "pt3", 00:13:42.105 "base_bdev_name": "malloc3" 00:13:42.105 } 00:13:42.105 } 00:13:42.105 }' 00:13:42.105 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.105 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.105 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:42.105 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.105 23:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.105 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:42.105 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.105 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.105 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:42.363 [2024-07-24 23:35:27.339327] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=1b775eb9-ddee-4a6d-8940-6035b8461585 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1b775eb9-ddee-4a6d-8940-6035b8461585 ']' 00:13:42.363 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:42.621 [2024-07-24 23:35:27.507583] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:42.621 [2024-07-24 23:35:27.507594] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:42.621 [2024-07-24 23:35:27.507626] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:42.621 [2024-07-24 23:35:27.507674] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:42.621 [2024-07-24 23:35:27.507680] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b2d20 name raid_bdev1, state offline 00:13:42.621 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.621 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:42.879 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:42.879 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:42.879 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:42.879 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:42.879 23:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:42.879 23:35:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:43.136 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:43.136 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:43.394 
23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:43.394 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:43.652 [2024-07-24 23:35:28.510144] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:43.652 [2024-07-24 23:35:28.511132] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:43.652 [2024-07-24 23:35:28.511161] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:43.652 [2024-07-24 23:35:28.511192] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:43.652 [2024-07-24 23:35:28.511217] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:43.652 [2024-07-24 23:35:28.511229] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:43.652 [2024-07-24 23:35:28.511238] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.652 [2024-07-24 23:35:28.511244] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f1250 name raid_bdev1, state configuring 00:13:43.652 request: 00:13:43.652 { 00:13:43.652 "name": "raid_bdev1", 00:13:43.652 "raid_level": "raid1", 00:13:43.652 "base_bdevs": [ 00:13:43.652 "malloc1", 00:13:43.652 "malloc2", 00:13:43.652 "malloc3" 00:13:43.652 ], 00:13:43.652 "superblock": false, 00:13:43.652 "method": "bdev_raid_create", 00:13:43.652 "req_id": 1 00:13:43.652 } 00:13:43.652 Got JSON-RPC error response 00:13:43.652 response: 00:13:43.652 { 00:13:43.652 "code": -17, 00:13:43.652 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:43.652 } 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.652 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:43.909 [2024-07-24 23:35:28.846982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:43.909 [2024-07-24 23:35:28.847010] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:13:43.909 [2024-07-24 23:35:28.847021] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f1000 00:13:43.909 [2024-07-24 23:35:28.847028] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:43.909 [2024-07-24 23:35:28.848163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:43.909 [2024-07-24 23:35:28.848184] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:43.909 [2024-07-24 23:35:28.848225] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:43.909 [2024-07-24 23:35:28.848243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:43.909 pt1 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.909 23:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:44.166 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.166 "name": "raid_bdev1", 00:13:44.166 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:44.166 "strip_size_kb": 0, 00:13:44.166 "state": "configuring", 00:13:44.166 "raid_level": "raid1", 00:13:44.166 "superblock": true, 00:13:44.166 "num_base_bdevs": 3, 00:13:44.166 "num_base_bdevs_discovered": 1, 00:13:44.166 "num_base_bdevs_operational": 3, 00:13:44.166 "base_bdevs_list": [ 00:13:44.166 { 00:13:44.166 "name": "pt1", 00:13:44.166 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:44.166 "is_configured": true, 00:13:44.166 "data_offset": 2048, 00:13:44.166 "data_size": 63488 00:13:44.166 }, 00:13:44.166 { 00:13:44.166 "name": null, 00:13:44.166 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:44.166 "is_configured": false, 00:13:44.166 "data_offset": 2048, 00:13:44.166 "data_size": 63488 00:13:44.166 }, 00:13:44.166 { 00:13:44.166 "name": null, 00:13:44.166 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:44.166 "is_configured": false, 00:13:44.166 "data_offset": 2048, 00:13:44.166 "data_size": 63488 00:13:44.166 } 00:13:44.166 ] 00:13:44.166 }' 00:13:44.166 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.166 23:35:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.731 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:44.731 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:44.731 
[2024-07-24 23:35:29.661128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:44.731 [2024-07-24 23:35:29.661162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:44.731 [2024-07-24 23:35:29.661173] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e8530 00:13:44.731 [2024-07-24 23:35:29.661179] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:44.731 [2024-07-24 23:35:29.661409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:44.731 [2024-07-24 23:35:29.661421] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:44.731 [2024-07-24 23:35:29.661461] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:44.731 [2024-07-24 23:35:29.661480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:44.731 pt2 00:13:44.731 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:44.988 [2024-07-24 23:35:29.829568] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:44.988 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.989 23:35:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.989 23:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:45.246 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.246 "name": "raid_bdev1", 00:13:45.246 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:45.246 "strip_size_kb": 0, 00:13:45.246 "state": "configuring", 00:13:45.246 "raid_level": "raid1", 00:13:45.246 "superblock": true, 00:13:45.246 "num_base_bdevs": 3, 00:13:45.246 "num_base_bdevs_discovered": 1, 00:13:45.246 "num_base_bdevs_operational": 3, 00:13:45.246 "base_bdevs_list": [ 00:13:45.246 { 00:13:45.246 "name": "pt1", 00:13:45.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:45.246 "is_configured": true, 00:13:45.246 "data_offset": 2048, 00:13:45.246 "data_size": 63488 00:13:45.246 }, 00:13:45.246 { 00:13:45.246 "name": null, 00:13:45.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:45.246 "is_configured": false, 00:13:45.246 "data_offset": 2048, 00:13:45.246 "data_size": 63488 00:13:45.246 }, 00:13:45.246 { 00:13:45.246 "name": null, 00:13:45.246 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:45.246 "is_configured": false, 00:13:45.246 "data_offset": 2048, 00:13:45.246 "data_size": 63488 00:13:45.246 } 00:13:45.246 ] 00:13:45.246 }' 00:13:45.246 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:13:45.246 23:35:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:45.811 [2024-07-24 23:35:30.671743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:45.811 [2024-07-24 23:35:30.671780] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:45.811 [2024-07-24 23:35:30.671793] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f1230 00:13:45.811 [2024-07-24 23:35:30.671798] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:45.811 [2024-07-24 23:35:30.672037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:45.811 [2024-07-24 23:35:30.672049] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:45.811 [2024-07-24 23:35:30.672091] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:45.811 [2024-07-24 23:35:30.672105] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:45.811 pt2 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:45.811 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:13:46.069 [2024-07-24 23:35:30.840174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:46.069 [2024-07-24 23:35:30.840198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:46.069 [2024-07-24 23:35:30.840206] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e7e50 00:13:46.069 [2024-07-24 23:35:30.840212] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.069 [2024-07-24 23:35:30.840414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.069 [2024-07-24 23:35:30.840424] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:46.069 [2024-07-24 23:35:30.840460] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:46.069 [2024-07-24 23:35:30.840476] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:46.069 [2024-07-24 23:35:30.840548] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ea3b0 00:13:46.069 [2024-07-24 23:35:30.840553] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:46.069 [2024-07-24 23:35:30.840661] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f4270 00:13:46.069 [2024-07-24 23:35:30.840744] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ea3b0 00:13:46.069 [2024-07-24 23:35:30.840749] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16ea3b0 00:13:46.069 [2024-07-24 23:35:30.840811] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.069 pt3 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:46.069 23:35:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.069 23:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:46.069 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.069 "name": "raid_bdev1", 00:13:46.069 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:46.069 "strip_size_kb": 0, 00:13:46.069 "state": "online", 00:13:46.069 "raid_level": "raid1", 00:13:46.069 "superblock": true, 00:13:46.069 "num_base_bdevs": 3, 00:13:46.069 "num_base_bdevs_discovered": 3, 00:13:46.069 "num_base_bdevs_operational": 3, 00:13:46.069 "base_bdevs_list": [ 00:13:46.069 { 00:13:46.069 "name": "pt1", 00:13:46.069 "uuid": "00000000-0000-0000-0000-000000000001", 
00:13:46.069 "is_configured": true, 00:13:46.069 "data_offset": 2048, 00:13:46.069 "data_size": 63488 00:13:46.069 }, 00:13:46.069 { 00:13:46.069 "name": "pt2", 00:13:46.069 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:46.069 "is_configured": true, 00:13:46.069 "data_offset": 2048, 00:13:46.069 "data_size": 63488 00:13:46.069 }, 00:13:46.069 { 00:13:46.069 "name": "pt3", 00:13:46.069 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:46.069 "is_configured": true, 00:13:46.069 "data_offset": 2048, 00:13:46.069 "data_size": 63488 00:13:46.069 } 00:13:46.069 ] 00:13:46.069 }' 00:13:46.069 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.069 23:35:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:46.634 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:46.892 [2024-07-24 23:35:31.682575] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:46.892 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:46.892 "name": "raid_bdev1", 00:13:46.892 
"aliases": [ 00:13:46.892 "1b775eb9-ddee-4a6d-8940-6035b8461585" 00:13:46.892 ], 00:13:46.892 "product_name": "Raid Volume", 00:13:46.892 "block_size": 512, 00:13:46.892 "num_blocks": 63488, 00:13:46.892 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:46.892 "assigned_rate_limits": { 00:13:46.892 "rw_ios_per_sec": 0, 00:13:46.892 "rw_mbytes_per_sec": 0, 00:13:46.892 "r_mbytes_per_sec": 0, 00:13:46.892 "w_mbytes_per_sec": 0 00:13:46.892 }, 00:13:46.892 "claimed": false, 00:13:46.892 "zoned": false, 00:13:46.892 "supported_io_types": { 00:13:46.892 "read": true, 00:13:46.892 "write": true, 00:13:46.892 "unmap": false, 00:13:46.892 "flush": false, 00:13:46.892 "reset": true, 00:13:46.892 "nvme_admin": false, 00:13:46.892 "nvme_io": false, 00:13:46.892 "nvme_io_md": false, 00:13:46.892 "write_zeroes": true, 00:13:46.892 "zcopy": false, 00:13:46.892 "get_zone_info": false, 00:13:46.892 "zone_management": false, 00:13:46.892 "zone_append": false, 00:13:46.892 "compare": false, 00:13:46.892 "compare_and_write": false, 00:13:46.892 "abort": false, 00:13:46.892 "seek_hole": false, 00:13:46.892 "seek_data": false, 00:13:46.892 "copy": false, 00:13:46.892 "nvme_iov_md": false 00:13:46.892 }, 00:13:46.892 "memory_domains": [ 00:13:46.892 { 00:13:46.892 "dma_device_id": "system", 00:13:46.892 "dma_device_type": 1 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.892 "dma_device_type": 2 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "dma_device_id": "system", 00:13:46.892 "dma_device_type": 1 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.892 "dma_device_type": 2 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "dma_device_id": "system", 00:13:46.892 "dma_device_type": 1 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.892 "dma_device_type": 2 00:13:46.892 } 00:13:46.892 ], 00:13:46.892 "driver_specific": { 00:13:46.892 "raid": { 00:13:46.892 
"uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:46.892 "strip_size_kb": 0, 00:13:46.892 "state": "online", 00:13:46.892 "raid_level": "raid1", 00:13:46.892 "superblock": true, 00:13:46.892 "num_base_bdevs": 3, 00:13:46.892 "num_base_bdevs_discovered": 3, 00:13:46.892 "num_base_bdevs_operational": 3, 00:13:46.892 "base_bdevs_list": [ 00:13:46.892 { 00:13:46.892 "name": "pt1", 00:13:46.892 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:46.892 "is_configured": true, 00:13:46.892 "data_offset": 2048, 00:13:46.892 "data_size": 63488 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "name": "pt2", 00:13:46.892 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:46.892 "is_configured": true, 00:13:46.892 "data_offset": 2048, 00:13:46.892 "data_size": 63488 00:13:46.892 }, 00:13:46.892 { 00:13:46.892 "name": "pt3", 00:13:46.892 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:46.892 "is_configured": true, 00:13:46.892 "data_offset": 2048, 00:13:46.892 "data_size": 63488 00:13:46.892 } 00:13:46.892 ] 00:13:46.892 } 00:13:46.892 } 00:13:46.892 }' 00:13:46.892 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:46.893 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:46.893 pt2 00:13:46.893 pt3' 00:13:46.893 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:46.893 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:46.893 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.150 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.150 "name": "pt1", 00:13:47.150 "aliases": [ 00:13:47.150 "00000000-0000-0000-0000-000000000001" 00:13:47.150 ], 
00:13:47.150 "product_name": "passthru", 00:13:47.150 "block_size": 512, 00:13:47.150 "num_blocks": 65536, 00:13:47.150 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:47.150 "assigned_rate_limits": { 00:13:47.150 "rw_ios_per_sec": 0, 00:13:47.150 "rw_mbytes_per_sec": 0, 00:13:47.150 "r_mbytes_per_sec": 0, 00:13:47.150 "w_mbytes_per_sec": 0 00:13:47.150 }, 00:13:47.150 "claimed": true, 00:13:47.150 "claim_type": "exclusive_write", 00:13:47.150 "zoned": false, 00:13:47.150 "supported_io_types": { 00:13:47.150 "read": true, 00:13:47.150 "write": true, 00:13:47.150 "unmap": true, 00:13:47.150 "flush": true, 00:13:47.150 "reset": true, 00:13:47.150 "nvme_admin": false, 00:13:47.150 "nvme_io": false, 00:13:47.150 "nvme_io_md": false, 00:13:47.150 "write_zeroes": true, 00:13:47.150 "zcopy": true, 00:13:47.150 "get_zone_info": false, 00:13:47.150 "zone_management": false, 00:13:47.150 "zone_append": false, 00:13:47.150 "compare": false, 00:13:47.150 "compare_and_write": false, 00:13:47.150 "abort": true, 00:13:47.150 "seek_hole": false, 00:13:47.150 "seek_data": false, 00:13:47.150 "copy": true, 00:13:47.150 "nvme_iov_md": false 00:13:47.150 }, 00:13:47.150 "memory_domains": [ 00:13:47.150 { 00:13:47.150 "dma_device_id": "system", 00:13:47.150 "dma_device_type": 1 00:13:47.150 }, 00:13:47.150 { 00:13:47.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.150 "dma_device_type": 2 00:13:47.150 } 00:13:47.150 ], 00:13:47.150 "driver_specific": { 00:13:47.150 "passthru": { 00:13:47.150 "name": "pt1", 00:13:47.150 "base_bdev_name": "malloc1" 00:13:47.150 } 00:13:47.150 } 00:13:47.150 }' 00:13:47.150 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.150 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.150 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:47.150 23:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.150 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.408 "name": "pt2", 00:13:47.408 "aliases": [ 00:13:47.408 "00000000-0000-0000-0000-000000000002" 00:13:47.408 ], 00:13:47.408 "product_name": "passthru", 00:13:47.408 "block_size": 512, 00:13:47.408 "num_blocks": 65536, 00:13:47.408 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:47.408 "assigned_rate_limits": { 00:13:47.408 "rw_ios_per_sec": 0, 00:13:47.408 "rw_mbytes_per_sec": 0, 00:13:47.408 "r_mbytes_per_sec": 0, 00:13:47.408 "w_mbytes_per_sec": 0 00:13:47.408 }, 00:13:47.408 "claimed": true, 00:13:47.408 "claim_type": "exclusive_write", 00:13:47.408 "zoned": false, 00:13:47.408 "supported_io_types": { 00:13:47.408 "read": true, 00:13:47.408 "write": true, 
00:13:47.408 "unmap": true, 00:13:47.408 "flush": true, 00:13:47.408 "reset": true, 00:13:47.408 "nvme_admin": false, 00:13:47.408 "nvme_io": false, 00:13:47.408 "nvme_io_md": false, 00:13:47.408 "write_zeroes": true, 00:13:47.408 "zcopy": true, 00:13:47.408 "get_zone_info": false, 00:13:47.408 "zone_management": false, 00:13:47.408 "zone_append": false, 00:13:47.408 "compare": false, 00:13:47.408 "compare_and_write": false, 00:13:47.408 "abort": true, 00:13:47.408 "seek_hole": false, 00:13:47.408 "seek_data": false, 00:13:47.408 "copy": true, 00:13:47.408 "nvme_iov_md": false 00:13:47.408 }, 00:13:47.408 "memory_domains": [ 00:13:47.408 { 00:13:47.408 "dma_device_id": "system", 00:13:47.408 "dma_device_type": 1 00:13:47.408 }, 00:13:47.408 { 00:13:47.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.408 "dma_device_type": 2 00:13:47.408 } 00:13:47.408 ], 00:13:47.408 "driver_specific": { 00:13:47.408 "passthru": { 00:13:47.408 "name": "pt2", 00:13:47.408 "base_bdev_name": "malloc2" 00:13:47.408 } 00:13:47.408 } 00:13:47.408 }' 00:13:47.408 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.665 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:47.922 "name": "pt3", 00:13:47.922 "aliases": [ 00:13:47.922 "00000000-0000-0000-0000-000000000003" 00:13:47.922 ], 00:13:47.922 "product_name": "passthru", 00:13:47.922 "block_size": 512, 00:13:47.922 "num_blocks": 65536, 00:13:47.922 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:47.922 "assigned_rate_limits": { 00:13:47.922 "rw_ios_per_sec": 0, 00:13:47.922 "rw_mbytes_per_sec": 0, 00:13:47.922 "r_mbytes_per_sec": 0, 00:13:47.922 "w_mbytes_per_sec": 0 00:13:47.922 }, 00:13:47.922 "claimed": true, 00:13:47.922 "claim_type": "exclusive_write", 00:13:47.922 "zoned": false, 00:13:47.922 "supported_io_types": { 00:13:47.922 "read": true, 00:13:47.922 "write": true, 00:13:47.922 "unmap": true, 00:13:47.922 "flush": true, 00:13:47.922 "reset": true, 00:13:47.922 "nvme_admin": false, 00:13:47.922 "nvme_io": false, 00:13:47.922 "nvme_io_md": false, 00:13:47.922 "write_zeroes": true, 00:13:47.922 "zcopy": true, 00:13:47.922 "get_zone_info": false, 00:13:47.922 "zone_management": false, 00:13:47.922 "zone_append": false, 00:13:47.922 "compare": false, 00:13:47.922 "compare_and_write": false, 00:13:47.922 "abort": true, 00:13:47.922 "seek_hole": false, 00:13:47.922 "seek_data": false, 00:13:47.922 "copy": true, 00:13:47.922 "nvme_iov_md": 
false 00:13:47.922 }, 00:13:47.922 "memory_domains": [ 00:13:47.922 { 00:13:47.922 "dma_device_id": "system", 00:13:47.922 "dma_device_type": 1 00:13:47.922 }, 00:13:47.922 { 00:13:47.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.922 "dma_device_type": 2 00:13:47.922 } 00:13:47.922 ], 00:13:47.922 "driver_specific": { 00:13:47.922 "passthru": { 00:13:47.922 "name": "pt3", 00:13:47.922 "base_bdev_name": "malloc3" 00:13:47.922 } 00:13:47.922 } 00:13:47.922 }' 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:47.922 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.180 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.180 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.180 23:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:48.180 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:48.438 [2024-07-24 
23:35:33.314781] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.438 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1b775eb9-ddee-4a6d-8940-6035b8461585 '!=' 1b775eb9-ddee-4a6d-8940-6035b8461585 ']' 00:13:48.438 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:48.438 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:48.438 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:48.438 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:48.695 [2024-07-24 23:35:33.483047] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.695 23:35:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.695 "name": "raid_bdev1", 00:13:48.695 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:48.695 "strip_size_kb": 0, 00:13:48.695 "state": "online", 00:13:48.695 "raid_level": "raid1", 00:13:48.695 "superblock": true, 00:13:48.695 "num_base_bdevs": 3, 00:13:48.695 "num_base_bdevs_discovered": 2, 00:13:48.695 "num_base_bdevs_operational": 2, 00:13:48.695 "base_bdevs_list": [ 00:13:48.695 { 00:13:48.695 "name": null, 00:13:48.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.695 "is_configured": false, 00:13:48.695 "data_offset": 2048, 00:13:48.695 "data_size": 63488 00:13:48.695 }, 00:13:48.695 { 00:13:48.695 "name": "pt2", 00:13:48.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:48.695 "is_configured": true, 00:13:48.695 "data_offset": 2048, 00:13:48.695 "data_size": 63488 00:13:48.695 }, 00:13:48.695 { 00:13:48.695 "name": "pt3", 00:13:48.695 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:48.695 "is_configured": true, 00:13:48.695 "data_offset": 2048, 00:13:48.695 "data_size": 63488 00:13:48.695 } 00:13:48.695 ] 00:13:48.695 }' 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.695 23:35:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.262 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:49.520 [2024-07-24 23:35:34.305161] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:49.520 [2024-07-24 23:35:34.305182] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:49.520 [2024-07-24 23:35:34.305217] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:49.520 [2024-07-24 23:35:34.305253] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:49.520 [2024-07-24 23:35:34.305260] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ea3b0 name raid_bdev1, state offline 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:49.520 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:49.778 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:49.778 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:49.778 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:50.036 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:50.036 23:35:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:50.036 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:50.036 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:50.036 23:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:50.036 [2024-07-24 23:35:35.002936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:50.036 [2024-07-24 23:35:35.002971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:50.036 [2024-07-24 23:35:35.002981] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e8150 00:13:50.036 [2024-07-24 23:35:35.002987] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:50.036 [2024-07-24 23:35:35.004101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:50.036 [2024-07-24 23:35:35.004121] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:50.036 [2024-07-24 23:35:35.004162] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:50.036 [2024-07-24 23:35:35.004179] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:50.036 pt2 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.036 23:35:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.036 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:50.294 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.294 "name": "raid_bdev1", 00:13:50.294 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:50.294 "strip_size_kb": 0, 00:13:50.294 "state": "configuring", 00:13:50.294 "raid_level": "raid1", 00:13:50.294 "superblock": true, 00:13:50.294 "num_base_bdevs": 3, 00:13:50.294 "num_base_bdevs_discovered": 1, 00:13:50.294 "num_base_bdevs_operational": 2, 00:13:50.294 "base_bdevs_list": [ 00:13:50.294 { 00:13:50.294 "name": null, 00:13:50.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.294 "is_configured": false, 00:13:50.294 "data_offset": 2048, 00:13:50.294 "data_size": 63488 00:13:50.294 }, 00:13:50.294 { 00:13:50.294 "name": "pt2", 00:13:50.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:50.294 "is_configured": true, 00:13:50.294 "data_offset": 2048, 00:13:50.294 "data_size": 63488 00:13:50.294 }, 00:13:50.294 { 00:13:50.294 "name": null, 00:13:50.294 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:50.294 
"is_configured": false, 00:13:50.294 "data_offset": 2048, 00:13:50.294 "data_size": 63488 00:13:50.294 } 00:13:50.294 ] 00:13:50.294 }' 00:13:50.294 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.294 23:35:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:50.858 [2024-07-24 23:35:35.829077] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:50.858 [2024-07-24 23:35:35.829109] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:50.858 [2024-07-24 23:35:35.829120] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f3f80 00:13:50.858 [2024-07-24 23:35:35.829126] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:50.858 [2024-07-24 23:35:35.829360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:50.858 [2024-07-24 23:35:35.829370] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:50.858 [2024-07-24 23:35:35.829409] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:50.858 [2024-07-24 23:35:35.829422] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:50.858 [2024-07-24 23:35:35.829496] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16e7720 00:13:50.858 [2024-07-24 
23:35:35.829502] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:50.858 [2024-07-24 23:35:35.829614] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b2b30 00:13:50.858 [2024-07-24 23:35:35.829701] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16e7720 00:13:50.858 [2024-07-24 23:35:35.829707] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16e7720 00:13:50.858 [2024-07-24 23:35:35.829771] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.858 pt3 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:50.858 23:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.116 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.116 "name": "raid_bdev1", 00:13:51.116 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:51.116 "strip_size_kb": 0, 00:13:51.116 "state": "online", 00:13:51.116 "raid_level": "raid1", 00:13:51.116 "superblock": true, 00:13:51.116 "num_base_bdevs": 3, 00:13:51.117 "num_base_bdevs_discovered": 2, 00:13:51.117 "num_base_bdevs_operational": 2, 00:13:51.117 "base_bdevs_list": [ 00:13:51.117 { 00:13:51.117 "name": null, 00:13:51.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.117 "is_configured": false, 00:13:51.117 "data_offset": 2048, 00:13:51.117 "data_size": 63488 00:13:51.117 }, 00:13:51.117 { 00:13:51.117 "name": "pt2", 00:13:51.117 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:51.117 "is_configured": true, 00:13:51.117 "data_offset": 2048, 00:13:51.117 "data_size": 63488 00:13:51.117 }, 00:13:51.117 { 00:13:51.117 "name": "pt3", 00:13:51.117 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:51.117 "is_configured": true, 00:13:51.117 "data_offset": 2048, 00:13:51.117 "data_size": 63488 00:13:51.117 } 00:13:51.117 ] 00:13:51.117 }' 00:13:51.117 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.117 23:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.682 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:51.682 [2024-07-24 23:35:36.659212] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:51.682 [2024-07-24 23:35:36.659229] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:51.682 [2024-07-24 23:35:36.659263] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:13:51.682 [2024-07-24 23:35:36.659297] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:51.682 [2024-07-24 23:35:36.659303] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e7720 name raid_bdev1, state offline 00:13:51.682 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:13:51.940 23:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:52.197 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:52.197 [2024-07-24 23:35:37.196587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:52.197 [2024-07-24 23:35:37.196617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:52.198 [2024-07-24 23:35:37.196627] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f3490 00:13:52.198 [2024-07-24 23:35:37.196633] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:52.455 [2024-07-24 23:35:37.197825] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:13:52.455 [2024-07-24 23:35:37.197845] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:52.456 [2024-07-24 23:35:37.197889] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:52.456 [2024-07-24 23:35:37.197907] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:52.456 [2024-07-24 23:35:37.197975] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:52.456 [2024-07-24 23:35:37.197982] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:52.456 [2024-07-24 23:35:37.197989] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f2dd0 name raid_bdev1, state configuring 00:13:52.456 [2024-07-24 23:35:37.198004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:52.456 pt1 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.456 
23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.456 "name": "raid_bdev1", 00:13:52.456 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 00:13:52.456 "strip_size_kb": 0, 00:13:52.456 "state": "configuring", 00:13:52.456 "raid_level": "raid1", 00:13:52.456 "superblock": true, 00:13:52.456 "num_base_bdevs": 3, 00:13:52.456 "num_base_bdevs_discovered": 1, 00:13:52.456 "num_base_bdevs_operational": 2, 00:13:52.456 "base_bdevs_list": [ 00:13:52.456 { 00:13:52.456 "name": null, 00:13:52.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.456 "is_configured": false, 00:13:52.456 "data_offset": 2048, 00:13:52.456 "data_size": 63488 00:13:52.456 }, 00:13:52.456 { 00:13:52.456 "name": "pt2", 00:13:52.456 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:52.456 "is_configured": true, 00:13:52.456 "data_offset": 2048, 00:13:52.456 "data_size": 63488 00:13:52.456 }, 00:13:52.456 { 00:13:52.456 "name": null, 00:13:52.456 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:52.456 "is_configured": false, 00:13:52.456 "data_offset": 2048, 00:13:52.456 "data_size": 63488 00:13:52.456 } 00:13:52.456 ] 00:13:52.456 }' 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.456 23:35:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.021 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:13:53.021 23:35:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:53.280 [2024-07-24 23:35:38.219354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:53.280 [2024-07-24 23:35:38.219390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.280 [2024-07-24 23:35:38.219401] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16e8930 00:13:53.280 [2024-07-24 23:35:38.219407] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.280 [2024-07-24 23:35:38.219649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.280 [2024-07-24 23:35:38.219661] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:53.280 [2024-07-24 23:35:38.219701] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:53.280 [2024-07-24 23:35:38.219714] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:53.280 [2024-07-24 23:35:38.219780] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f3050 00:13:53.280 [2024-07-24 23:35:38.219786] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:53.280 [2024-07-24 23:35:38.219891] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16eeb00 00:13:53.280 [2024-07-24 23:35:38.219974] 
bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f3050 00:13:53.280 [2024-07-24 23:35:38.219978] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16f3050 00:13:53.280 [2024-07-24 23:35:38.220041] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:53.280 pt3 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.280 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:53.538 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.538 "name": "raid_bdev1", 00:13:53.538 "uuid": "1b775eb9-ddee-4a6d-8940-6035b8461585", 
00:13:53.538 "strip_size_kb": 0, 00:13:53.538 "state": "online", 00:13:53.538 "raid_level": "raid1", 00:13:53.538 "superblock": true, 00:13:53.538 "num_base_bdevs": 3, 00:13:53.538 "num_base_bdevs_discovered": 2, 00:13:53.538 "num_base_bdevs_operational": 2, 00:13:53.538 "base_bdevs_list": [ 00:13:53.538 { 00:13:53.538 "name": null, 00:13:53.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.538 "is_configured": false, 00:13:53.538 "data_offset": 2048, 00:13:53.538 "data_size": 63488 00:13:53.538 }, 00:13:53.538 { 00:13:53.538 "name": "pt2", 00:13:53.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:53.538 "is_configured": true, 00:13:53.538 "data_offset": 2048, 00:13:53.538 "data_size": 63488 00:13:53.538 }, 00:13:53.538 { 00:13:53.538 "name": "pt3", 00:13:53.538 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:53.538 "is_configured": true, 00:13:53.538 "data_offset": 2048, 00:13:53.538 "data_size": 63488 00:13:53.538 } 00:13:53.538 ] 00:13:53.538 }' 00:13:53.538 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.538 23:35:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.104 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:54.104 23:35:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:54.104 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:54.104 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:54.104 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:54.362 [2024-07-24 23:35:39.190025] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:54.362 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 1b775eb9-ddee-4a6d-8940-6035b8461585 '!=' 1b775eb9-ddee-4a6d-8940-6035b8461585 ']' 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 296860 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 296860 ']' 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 296860 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 296860 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 296860' 00:13:54.363 killing process with pid 296860 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 296860 00:13:54.363 [2024-07-24 23:35:39.247001] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:54.363 [2024-07-24 23:35:39.247039] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:54.363 [2024-07-24 23:35:39.247075] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:54.363 [2024-07-24 23:35:39.247080] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f3050 name raid_bdev1, state offline 00:13:54.363 23:35:39 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@974 -- # wait 296860 00:13:54.363 [2024-07-24 23:35:39.271172] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:54.621 23:35:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:54.621 00:13:54.621 real 0m16.591s 00:13:54.621 user 0m30.706s 00:13:54.621 sys 0m2.618s 00:13:54.621 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:54.621 23:35:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.621 ************************************ 00:13:54.621 END TEST raid_superblock_test 00:13:54.621 ************************************ 00:13:54.621 23:35:39 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:13:54.621 23:35:39 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:54.621 23:35:39 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:54.621 23:35:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:54.621 ************************************ 00:13:54.621 START TEST raid_read_error_test 00:13:54.621 ************************************ 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 read 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.j4aK8FWKRj 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@808 -- # raid_pid=300053 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 300053 /var/tmp/spdk-raid.sock 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 300053 ']' 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:54.621 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:54.621 23:35:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.621 [2024-07-24 23:35:39.568974] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:13:54.622 [2024-07-24 23:35:39.569014] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid300053 ] 00:13:54.879 [2024-07-24 23:35:39.631827] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:54.879 [2024-07-24 23:35:39.709985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:54.879 [2024-07-24 23:35:39.767491] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:54.879 [2024-07-24 23:35:39.767520] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:55.445 23:35:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:55.445 23:35:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:55.445 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:55.445 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:55.702 BaseBdev1_malloc 00:13:55.702 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:55.702 true 00:13:55.702 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:55.958 [2024-07-24 23:35:40.843729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:55.958 [2024-07-24 23:35:40.843760] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:55.958 [2024-07-24 23:35:40.843772] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x267b550 00:13:55.958 [2024-07-24 23:35:40.843778] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:55.958 [2024-07-24 23:35:40.845016] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:55.958 [2024-07-24 23:35:40.845038] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:55.958 BaseBdev1 00:13:55.958 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:55.958 23:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:56.215 BaseBdev2_malloc 00:13:56.215 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:56.215 true 00:13:56.215 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:56.473 [2024-07-24 23:35:41.320408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:56.473 [2024-07-24 23:35:41.320436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:56.473 [2024-07-24 23:35:41.320446] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x267fd90 00:13:56.473 [2024-07-24 23:35:41.320452] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:56.473 [2024-07-24 23:35:41.321441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:56.473 [2024-07-24 23:35:41.321462] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:56.473 BaseBdev2 00:13:56.473 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:56.473 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:56.759 BaseBdev3_malloc 00:13:56.759 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:56.759 true 00:13:56.759 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:57.027 [2024-07-24 23:35:41.829136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:57.027 [2024-07-24 23:35:41.829168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:57.027 [2024-07-24 23:35:41.829180] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2682050 00:13:57.027 [2024-07-24 23:35:41.829186] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:57.027 [2024-07-24 23:35:41.830283] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:57.027 [2024-07-24 23:35:41.830304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:57.027 BaseBdev3 00:13:57.027 23:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:57.027 [2024-07-24 23:35:41.993582] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:57.027 [2024-07-24 23:35:41.994466] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:57.027 [2024-07-24 23:35:41.994523] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:57.027 [2024-07-24 23:35:41.994670] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2683700 00:13:57.027 [2024-07-24 23:35:41.994678] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:57.027 [2024-07-24 23:35:41.994814] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26832a0 00:13:57.027 [2024-07-24 23:35:41.994920] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2683700 00:13:57.027 [2024-07-24 23:35:41.994925] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2683700 00:13:57.027 [2024-07-24 23:35:41.994994] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.027 
23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.027 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:57.285 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.285 "name": "raid_bdev1", 00:13:57.285 "uuid": "07c6c816-4b24-4b1f-8143-8ae324f5edc2", 00:13:57.285 "strip_size_kb": 0, 00:13:57.285 "state": "online", 00:13:57.285 "raid_level": "raid1", 00:13:57.285 "superblock": true, 00:13:57.285 "num_base_bdevs": 3, 00:13:57.285 "num_base_bdevs_discovered": 3, 00:13:57.285 "num_base_bdevs_operational": 3, 00:13:57.285 "base_bdevs_list": [ 00:13:57.285 { 00:13:57.285 "name": "BaseBdev1", 00:13:57.285 "uuid": "f456de28-732f-50f5-8c3c-ed856f8d2608", 00:13:57.285 "is_configured": true, 00:13:57.285 "data_offset": 2048, 00:13:57.285 "data_size": 63488 00:13:57.285 }, 00:13:57.285 { 00:13:57.285 "name": "BaseBdev2", 00:13:57.285 "uuid": "26556208-bf89-5f31-8bbc-0015cf8527c3", 00:13:57.285 "is_configured": true, 00:13:57.285 "data_offset": 2048, 00:13:57.285 "data_size": 63488 00:13:57.285 }, 00:13:57.285 { 00:13:57.285 "name": "BaseBdev3", 00:13:57.285 "uuid": "49d155b3-1e50-5285-985a-91256a5c7829", 00:13:57.285 "is_configured": true, 00:13:57.285 "data_offset": 2048, 00:13:57.285 "data_size": 63488 00:13:57.285 } 00:13:57.285 ] 00:13:57.285 }' 00:13:57.285 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.285 23:35:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.850 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:13:57.850 23:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:57.850 [2024-07-24 23:35:42.755799] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24d1280 00:13:58.783 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:59.040 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.041 23:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:59.041 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.041 "name": "raid_bdev1", 00:13:59.041 "uuid": "07c6c816-4b24-4b1f-8143-8ae324f5edc2", 00:13:59.041 "strip_size_kb": 0, 00:13:59.041 "state": "online", 00:13:59.041 "raid_level": "raid1", 00:13:59.041 "superblock": true, 00:13:59.041 "num_base_bdevs": 3, 00:13:59.041 "num_base_bdevs_discovered": 3, 00:13:59.041 "num_base_bdevs_operational": 3, 00:13:59.041 "base_bdevs_list": [ 00:13:59.041 { 00:13:59.041 "name": "BaseBdev1", 00:13:59.041 "uuid": "f456de28-732f-50f5-8c3c-ed856f8d2608", 00:13:59.041 "is_configured": true, 00:13:59.041 "data_offset": 2048, 00:13:59.041 "data_size": 63488 00:13:59.041 }, 00:13:59.041 { 00:13:59.041 "name": "BaseBdev2", 00:13:59.041 "uuid": "26556208-bf89-5f31-8bbc-0015cf8527c3", 00:13:59.041 "is_configured": true, 00:13:59.041 "data_offset": 2048, 00:13:59.041 "data_size": 63488 00:13:59.041 }, 00:13:59.041 { 00:13:59.041 "name": "BaseBdev3", 00:13:59.041 "uuid": "49d155b3-1e50-5285-985a-91256a5c7829", 00:13:59.041 "is_configured": true, 00:13:59.041 "data_offset": 2048, 00:13:59.041 "data_size": 63488 00:13:59.041 } 00:13:59.041 ] 00:13:59.041 }' 00:13:59.041 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.041 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.605 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:59.863 [2024-07-24 23:35:44.705259] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:59.863 [2024-07-24 23:35:44.705293] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:59.863 [2024-07-24 23:35:44.707326] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:59.863 [2024-07-24 23:35:44.707348] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:59.863 [2024-07-24 23:35:44.707409] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:59.863 [2024-07-24 23:35:44.707415] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2683700 name raid_bdev1, state offline 00:13:59.863 0 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 300053 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 300053 ']' 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 300053 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 300053 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 300053' 00:13:59.863 killing process with pid 300053 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 
300053 00:13:59.863 [2024-07-24 23:35:44.765406] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:59.863 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 300053 00:13:59.863 [2024-07-24 23:35:44.783964] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.j4aK8FWKRj 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:00.121 00:14:00.121 real 0m5.465s 00:14:00.121 user 0m8.488s 00:14:00.121 sys 0m0.799s 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:00.121 23:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.121 ************************************ 00:14:00.121 END TEST raid_read_error_test 00:14:00.121 ************************************ 00:14:00.121 23:35:45 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:14:00.121 23:35:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:00.121 23:35:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:00.121 23:35:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:00.121 ************************************ 
00:14:00.121 START TEST raid_write_error_test 00:14:00.121 ************************************ 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 write 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RXQ9IW8dfI 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=301064 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 301064 /var/tmp/spdk-raid.sock 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 301064 ']' 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:00.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:00.121 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.121 [2024-07-24 23:35:45.090609] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:14:00.121 [2024-07-24 23:35:45.090647] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid301064 ] 00:14:00.379 [2024-07-24 23:35:45.152610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:00.379 [2024-07-24 23:35:45.231255] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:00.379 [2024-07-24 23:35:45.289443] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:00.379 [2024-07-24 23:35:45.289473] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:00.945 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:00.945 23:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:00.945 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:00.945 23:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:01.204 BaseBdev1_malloc 00:14:01.204 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:01.462 true 00:14:01.462 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:01.462 [2024-07-24 23:35:46.361354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:01.462 [2024-07-24 23:35:46.361389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:01.462 [2024-07-24 23:35:46.361402] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd20550 00:14:01.462 [2024-07-24 23:35:46.361408] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:01.462 [2024-07-24 23:35:46.362592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:01.462 [2024-07-24 23:35:46.362613] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:01.462 BaseBdev1 00:14:01.462 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:01.462 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:01.720 BaseBdev2_malloc 00:14:01.720 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:01.720 true 00:14:01.720 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:01.978 [2024-07-24 23:35:46.825994] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:01.978 [2024-07-24 23:35:46.826025] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:01.978 [2024-07-24 
23:35:46.826039] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd24d90 00:14:01.978 [2024-07-24 23:35:46.826046] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:01.978 [2024-07-24 23:35:46.827053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:01.978 [2024-07-24 23:35:46.827075] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:01.978 BaseBdev2 00:14:01.978 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:01.978 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:02.236 BaseBdev3_malloc 00:14:02.236 23:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:02.236 true 00:14:02.236 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:02.493 [2024-07-24 23:35:47.318839] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:02.493 [2024-07-24 23:35:47.318870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:02.493 [2024-07-24 23:35:47.318883] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd27050 00:14:02.493 [2024-07-24 23:35:47.318888] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:02.493 [2024-07-24 23:35:47.319914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:02.493 [2024-07-24 23:35:47.319934] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:02.493 BaseBdev3 00:14:02.493 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:02.493 [2024-07-24 23:35:47.487296] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:02.493 [2024-07-24 23:35:47.488182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:02.493 [2024-07-24 23:35:47.488228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:02.493 [2024-07-24 23:35:47.488365] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd28700 00:14:02.493 [2024-07-24 23:35:47.488372] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:02.493 [2024-07-24 23:35:47.488527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd282a0 00:14:02.493 [2024-07-24 23:35:47.488636] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd28700 00:14:02.493 [2024-07-24 23:35:47.488642] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd28700 00:14:02.493 [2024-07-24 23:35:47.488711] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.750 "name": "raid_bdev1", 00:14:02.750 "uuid": "66bc730e-5e58-4ab0-8f03-7c2521ecf041", 00:14:02.750 "strip_size_kb": 0, 00:14:02.750 "state": "online", 00:14:02.750 "raid_level": "raid1", 00:14:02.750 "superblock": true, 00:14:02.750 "num_base_bdevs": 3, 00:14:02.750 "num_base_bdevs_discovered": 3, 00:14:02.750 "num_base_bdevs_operational": 3, 00:14:02.750 "base_bdevs_list": [ 00:14:02.750 { 00:14:02.750 "name": "BaseBdev1", 00:14:02.750 "uuid": "c3d8c523-e1c2-5f64-a8e1-939e395955af", 00:14:02.750 "is_configured": true, 00:14:02.750 "data_offset": 2048, 00:14:02.750 "data_size": 63488 00:14:02.750 }, 00:14:02.750 { 00:14:02.750 "name": "BaseBdev2", 00:14:02.750 "uuid": "e67603fa-2135-5659-8910-26229c1b292c", 00:14:02.750 "is_configured": true, 00:14:02.750 "data_offset": 2048, 00:14:02.750 "data_size": 63488 00:14:02.750 }, 00:14:02.750 { 00:14:02.750 "name": "BaseBdev3", 00:14:02.750 "uuid": "bf746307-6a75-524a-97bc-7cd0c682019c", 00:14:02.750 "is_configured": 
true, 00:14:02.750 "data_offset": 2048, 00:14:02.750 "data_size": 63488 00:14:02.750 } 00:14:02.750 ] 00:14:02.750 }' 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.750 23:35:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.316 23:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:03.316 23:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:03.316 [2024-07-24 23:35:48.233437] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb76280 00:14:04.250 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:04.508 [2024-07-24 23:35:49.317532] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:04.508 [2024-07-24 23:35:49.317572] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:04.508 [2024-07-24 23:35:49.317741] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb76280 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.508 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:04.767 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.767 "name": "raid_bdev1", 00:14:04.767 "uuid": "66bc730e-5e58-4ab0-8f03-7c2521ecf041", 00:14:04.767 "strip_size_kb": 0, 00:14:04.767 "state": "online", 00:14:04.767 "raid_level": "raid1", 00:14:04.767 "superblock": true, 00:14:04.767 "num_base_bdevs": 3, 00:14:04.767 "num_base_bdevs_discovered": 2, 00:14:04.767 "num_base_bdevs_operational": 2, 00:14:04.767 "base_bdevs_list": [ 00:14:04.767 { 00:14:04.767 "name": null, 00:14:04.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.767 "is_configured": false, 00:14:04.767 "data_offset": 2048, 00:14:04.767 "data_size": 63488 00:14:04.767 }, 00:14:04.767 { 00:14:04.767 "name": 
"BaseBdev2", 00:14:04.767 "uuid": "e67603fa-2135-5659-8910-26229c1b292c", 00:14:04.767 "is_configured": true, 00:14:04.767 "data_offset": 2048, 00:14:04.767 "data_size": 63488 00:14:04.767 }, 00:14:04.767 { 00:14:04.767 "name": "BaseBdev3", 00:14:04.767 "uuid": "bf746307-6a75-524a-97bc-7cd0c682019c", 00:14:04.767 "is_configured": true, 00:14:04.767 "data_offset": 2048, 00:14:04.767 "data_size": 63488 00:14:04.767 } 00:14:04.767 ] 00:14:04.767 }' 00:14:04.767 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.767 23:35:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.025 23:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:05.284 [2024-07-24 23:35:50.156640] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:05.284 [2024-07-24 23:35:50.156676] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:05.284 [2024-07-24 23:35:50.158649] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:05.284 [2024-07-24 23:35:50.158671] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:05.284 [2024-07-24 23:35:50.158717] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:05.284 [2024-07-24 23:35:50.158723] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd28700 name raid_bdev1, state offline 00:14:05.284 0 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 301064 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 301064 ']' 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 301064 00:14:05.284 23:35:50 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 301064 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 301064' 00:14:05.284 killing process with pid 301064 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 301064 00:14:05.284 [2024-07-24 23:35:50.225250] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:05.284 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 301064 00:14:05.284 [2024-07-24 23:35:50.243362] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RXQ9IW8dfI 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:05.543 00:14:05.543 real 
0m5.387s 00:14:05.543 user 0m8.380s 00:14:05.543 sys 0m0.779s 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:05.543 23:35:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.543 ************************************ 00:14:05.543 END TEST raid_write_error_test 00:14:05.543 ************************************ 00:14:05.543 23:35:50 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:05.543 23:35:50 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:05.543 23:35:50 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:14:05.543 23:35:50 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:05.543 23:35:50 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:05.543 23:35:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:05.543 ************************************ 00:14:05.543 START TEST raid_state_function_test 00:14:05.543 ************************************ 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 false 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 
00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:05.543 23:35:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=302070 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 302070' 00:14:05.543 Process raid pid: 302070 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 302070 /var/tmp/spdk-raid.sock 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 302070 ']' 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:05.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:05.543 23:35:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.802 [2024-07-24 23:35:50.551231] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:14:05.802 [2024-07-24 23:35:50.551270] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:05.802 [2024-07-24 23:35:50.614451] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:05.802 [2024-07-24 23:35:50.693333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:05.802 [2024-07-24 23:35:50.749874] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:05.802 [2024-07-24 23:35:50.749901] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:06.368 23:35:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:06.368 23:35:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:14:06.368 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:06.627 [2024-07-24 23:35:51.497565] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:06.627 [2024-07-24 23:35:51.497595] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:06.627 [2024-07-24 23:35:51.497605] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:06.627 [2024-07-24 23:35:51.497611] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:06.627 [2024-07-24 23:35:51.497618] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:06.627 [2024-07-24 23:35:51.497624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:06.627 [2024-07-24 23:35:51.497627] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:06.627 [2024-07-24 23:35:51.497633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.627 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.885 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.885 "name": "Existed_Raid", 00:14:06.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.885 "strip_size_kb": 64, 00:14:06.885 "state": "configuring", 00:14:06.885 "raid_level": "raid0", 00:14:06.885 "superblock": false, 00:14:06.885 "num_base_bdevs": 4, 00:14:06.885 "num_base_bdevs_discovered": 0, 00:14:06.885 "num_base_bdevs_operational": 4, 00:14:06.885 "base_bdevs_list": [ 00:14:06.885 { 00:14:06.885 "name": "BaseBdev1", 00:14:06.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.885 "is_configured": false, 00:14:06.885 "data_offset": 0, 00:14:06.885 "data_size": 0 00:14:06.885 }, 00:14:06.885 { 00:14:06.885 "name": "BaseBdev2", 00:14:06.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.885 "is_configured": false, 00:14:06.885 "data_offset": 0, 00:14:06.885 "data_size": 0 00:14:06.885 }, 00:14:06.885 { 00:14:06.885 "name": "BaseBdev3", 00:14:06.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.886 "is_configured": false, 00:14:06.886 "data_offset": 0, 00:14:06.886 "data_size": 0 00:14:06.886 }, 00:14:06.886 { 00:14:06.886 "name": "BaseBdev4", 00:14:06.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.886 "is_configured": false, 00:14:06.886 "data_offset": 0, 00:14:06.886 "data_size": 0 00:14:06.886 } 00:14:06.886 ] 00:14:06.886 }' 00:14:06.886 23:35:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.886 23:35:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.453 23:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:07.453 [2024-07-24 23:35:52.335656] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:07.453 [2024-07-24 23:35:52.335683] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f0b50 name Existed_Raid, state configuring 00:14:07.453 23:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:07.711 [2024-07-24 23:35:52.512125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:07.711 [2024-07-24 23:35:52.512142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:07.711 [2024-07-24 23:35:52.512147] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:07.711 [2024-07-24 23:35:52.512156] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:07.711 [2024-07-24 23:35:52.512160] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:07.711 [2024-07-24 23:35:52.512164] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:07.711 [2024-07-24 23:35:52.512168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:07.711 [2024-07-24 23:35:52.512173] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:07.711 23:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:07.711 [2024-07-24 23:35:52.696724] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:07.711 BaseBdev1 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # 
waitforbdev BaseBdev1 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.969 23:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:08.227 [ 00:14:08.227 { 00:14:08.227 "name": "BaseBdev1", 00:14:08.227 "aliases": [ 00:14:08.227 "3062aea6-de4b-4b3b-82dd-81849f52f42c" 00:14:08.227 ], 00:14:08.227 "product_name": "Malloc disk", 00:14:08.227 "block_size": 512, 00:14:08.227 "num_blocks": 65536, 00:14:08.227 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:08.227 "assigned_rate_limits": { 00:14:08.227 "rw_ios_per_sec": 0, 00:14:08.227 "rw_mbytes_per_sec": 0, 00:14:08.227 "r_mbytes_per_sec": 0, 00:14:08.227 "w_mbytes_per_sec": 0 00:14:08.227 }, 00:14:08.227 "claimed": true, 00:14:08.227 "claim_type": "exclusive_write", 00:14:08.227 "zoned": false, 00:14:08.227 "supported_io_types": { 00:14:08.227 "read": true, 00:14:08.227 "write": true, 00:14:08.227 "unmap": true, 00:14:08.227 "flush": true, 00:14:08.227 "reset": true, 00:14:08.227 "nvme_admin": false, 00:14:08.227 "nvme_io": false, 00:14:08.227 "nvme_io_md": false, 00:14:08.227 "write_zeroes": true, 00:14:08.227 "zcopy": true, 00:14:08.227 
"get_zone_info": false, 00:14:08.227 "zone_management": false, 00:14:08.227 "zone_append": false, 00:14:08.227 "compare": false, 00:14:08.227 "compare_and_write": false, 00:14:08.227 "abort": true, 00:14:08.227 "seek_hole": false, 00:14:08.227 "seek_data": false, 00:14:08.227 "copy": true, 00:14:08.227 "nvme_iov_md": false 00:14:08.227 }, 00:14:08.227 "memory_domains": [ 00:14:08.227 { 00:14:08.227 "dma_device_id": "system", 00:14:08.227 "dma_device_type": 1 00:14:08.227 }, 00:14:08.227 { 00:14:08.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.227 "dma_device_type": 2 00:14:08.227 } 00:14:08.227 ], 00:14:08.227 "driver_specific": {} 00:14:08.227 } 00:14:08.227 ] 00:14:08.227 23:35:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:08.227 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.228 "name": "Existed_Raid", 00:14:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.228 "strip_size_kb": 64, 00:14:08.228 "state": "configuring", 00:14:08.228 "raid_level": "raid0", 00:14:08.228 "superblock": false, 00:14:08.228 "num_base_bdevs": 4, 00:14:08.228 "num_base_bdevs_discovered": 1, 00:14:08.228 "num_base_bdevs_operational": 4, 00:14:08.228 "base_bdevs_list": [ 00:14:08.228 { 00:14:08.228 "name": "BaseBdev1", 00:14:08.228 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:08.228 "is_configured": true, 00:14:08.228 "data_offset": 0, 00:14:08.228 "data_size": 65536 00:14:08.228 }, 00:14:08.228 { 00:14:08.228 "name": "BaseBdev2", 00:14:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.228 "is_configured": false, 00:14:08.228 "data_offset": 0, 00:14:08.228 "data_size": 0 00:14:08.228 }, 00:14:08.228 { 00:14:08.228 "name": "BaseBdev3", 00:14:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.228 "is_configured": false, 00:14:08.228 "data_offset": 0, 00:14:08.228 "data_size": 0 00:14:08.228 }, 00:14:08.228 { 00:14:08.228 "name": "BaseBdev4", 00:14:08.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.228 "is_configured": false, 00:14:08.228 "data_offset": 0, 00:14:08.228 "data_size": 0 00:14:08.228 } 00:14:08.228 ] 00:14:08.228 }' 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.228 23:35:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.793 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:09.050 [2024-07-24 23:35:53.855713] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:09.050 [2024-07-24 23:35:53.855742] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f03a0 name Existed_Raid, state configuring 00:14:09.050 23:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:09.050 [2024-07-24 23:35:54.024159] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:09.050 [2024-07-24 23:35:54.025169] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:09.050 [2024-07-24 23:35:54.025193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:09.050 [2024-07-24 23:35:54.025199] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:09.050 [2024-07-24 23:35:54.025204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:09.050 [2024-07-24 23:35:54.025209] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:09.051 [2024-07-24 23:35:54.025214] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.051 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.308 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.308 "name": "Existed_Raid", 00:14:09.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.308 "strip_size_kb": 64, 00:14:09.308 "state": "configuring", 00:14:09.308 "raid_level": "raid0", 00:14:09.308 "superblock": false, 00:14:09.308 "num_base_bdevs": 4, 00:14:09.308 "num_base_bdevs_discovered": 1, 00:14:09.308 "num_base_bdevs_operational": 4, 00:14:09.308 "base_bdevs_list": [ 00:14:09.308 { 00:14:09.308 "name": "BaseBdev1", 00:14:09.308 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:09.308 "is_configured": true, 00:14:09.308 "data_offset": 0, 00:14:09.308 "data_size": 65536 
00:14:09.308 }, 00:14:09.308 { 00:14:09.308 "name": "BaseBdev2", 00:14:09.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.308 "is_configured": false, 00:14:09.308 "data_offset": 0, 00:14:09.308 "data_size": 0 00:14:09.308 }, 00:14:09.308 { 00:14:09.308 "name": "BaseBdev3", 00:14:09.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.308 "is_configured": false, 00:14:09.308 "data_offset": 0, 00:14:09.308 "data_size": 0 00:14:09.308 }, 00:14:09.308 { 00:14:09.308 "name": "BaseBdev4", 00:14:09.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.308 "is_configured": false, 00:14:09.308 "data_offset": 0, 00:14:09.308 "data_size": 0 00:14:09.308 } 00:14:09.308 ] 00:14:09.308 }' 00:14:09.308 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.308 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:09.873 [2024-07-24 23:35:54.848923] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:09.873 BaseBdev2 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:09.873 23:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:09.873 23:35:54 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:10.130 23:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:10.388 [ 00:14:10.388 { 00:14:10.388 "name": "BaseBdev2", 00:14:10.388 "aliases": [ 00:14:10.388 "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c" 00:14:10.388 ], 00:14:10.388 "product_name": "Malloc disk", 00:14:10.388 "block_size": 512, 00:14:10.388 "num_blocks": 65536, 00:14:10.388 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:10.388 "assigned_rate_limits": { 00:14:10.388 "rw_ios_per_sec": 0, 00:14:10.388 "rw_mbytes_per_sec": 0, 00:14:10.388 "r_mbytes_per_sec": 0, 00:14:10.388 "w_mbytes_per_sec": 0 00:14:10.388 }, 00:14:10.388 "claimed": true, 00:14:10.388 "claim_type": "exclusive_write", 00:14:10.388 "zoned": false, 00:14:10.388 "supported_io_types": { 00:14:10.388 "read": true, 00:14:10.388 "write": true, 00:14:10.388 "unmap": true, 00:14:10.388 "flush": true, 00:14:10.388 "reset": true, 00:14:10.388 "nvme_admin": false, 00:14:10.388 "nvme_io": false, 00:14:10.388 "nvme_io_md": false, 00:14:10.388 "write_zeroes": true, 00:14:10.388 "zcopy": true, 00:14:10.388 "get_zone_info": false, 00:14:10.388 "zone_management": false, 00:14:10.388 "zone_append": false, 00:14:10.388 "compare": false, 00:14:10.388 "compare_and_write": false, 00:14:10.388 "abort": true, 00:14:10.388 "seek_hole": false, 00:14:10.388 "seek_data": false, 00:14:10.388 "copy": true, 00:14:10.388 "nvme_iov_md": false 00:14:10.388 }, 00:14:10.388 "memory_domains": [ 00:14:10.388 { 00:14:10.388 "dma_device_id": "system", 00:14:10.388 "dma_device_type": 1 00:14:10.388 }, 00:14:10.388 { 00:14:10.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.388 "dma_device_type": 2 00:14:10.388 } 00:14:10.388 
], 00:14:10.388 "driver_specific": {} 00:14:10.388 } 00:14:10.388 ] 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.388 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.647 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.647 "name": 
"Existed_Raid", 00:14:10.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.647 "strip_size_kb": 64, 00:14:10.647 "state": "configuring", 00:14:10.647 "raid_level": "raid0", 00:14:10.647 "superblock": false, 00:14:10.647 "num_base_bdevs": 4, 00:14:10.647 "num_base_bdevs_discovered": 2, 00:14:10.647 "num_base_bdevs_operational": 4, 00:14:10.647 "base_bdevs_list": [ 00:14:10.647 { 00:14:10.647 "name": "BaseBdev1", 00:14:10.647 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:10.647 "is_configured": true, 00:14:10.647 "data_offset": 0, 00:14:10.647 "data_size": 65536 00:14:10.647 }, 00:14:10.647 { 00:14:10.647 "name": "BaseBdev2", 00:14:10.647 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:10.647 "is_configured": true, 00:14:10.647 "data_offset": 0, 00:14:10.647 "data_size": 65536 00:14:10.647 }, 00:14:10.647 { 00:14:10.647 "name": "BaseBdev3", 00:14:10.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.647 "is_configured": false, 00:14:10.647 "data_offset": 0, 00:14:10.647 "data_size": 0 00:14:10.647 }, 00:14:10.647 { 00:14:10.647 "name": "BaseBdev4", 00:14:10.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.647 "is_configured": false, 00:14:10.647 "data_offset": 0, 00:14:10.647 "data_size": 0 00:14:10.647 } 00:14:10.647 ] 00:14:10.647 }' 00:14:10.647 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.647 23:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.905 23:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:11.162 [2024-07-24 23:35:56.006474] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:11.162 BaseBdev3 00:14:11.162 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:11.162 23:35:56 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:11.163 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:11.163 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:11.163 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:11.163 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:11.163 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.420 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:11.420 [ 00:14:11.420 { 00:14:11.420 "name": "BaseBdev3", 00:14:11.420 "aliases": [ 00:14:11.420 "7ee761cf-7016-4bd7-bfe9-406968781967" 00:14:11.420 ], 00:14:11.420 "product_name": "Malloc disk", 00:14:11.420 "block_size": 512, 00:14:11.420 "num_blocks": 65536, 00:14:11.420 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:11.420 "assigned_rate_limits": { 00:14:11.420 "rw_ios_per_sec": 0, 00:14:11.420 "rw_mbytes_per_sec": 0, 00:14:11.420 "r_mbytes_per_sec": 0, 00:14:11.420 "w_mbytes_per_sec": 0 00:14:11.420 }, 00:14:11.420 "claimed": true, 00:14:11.420 "claim_type": "exclusive_write", 00:14:11.420 "zoned": false, 00:14:11.420 "supported_io_types": { 00:14:11.420 "read": true, 00:14:11.420 "write": true, 00:14:11.420 "unmap": true, 00:14:11.420 "flush": true, 00:14:11.420 "reset": true, 00:14:11.420 "nvme_admin": false, 00:14:11.420 "nvme_io": false, 00:14:11.420 "nvme_io_md": false, 00:14:11.420 "write_zeroes": true, 00:14:11.420 "zcopy": true, 00:14:11.420 "get_zone_info": false, 00:14:11.420 
"zone_management": false, 00:14:11.420 "zone_append": false, 00:14:11.420 "compare": false, 00:14:11.420 "compare_and_write": false, 00:14:11.420 "abort": true, 00:14:11.420 "seek_hole": false, 00:14:11.420 "seek_data": false, 00:14:11.420 "copy": true, 00:14:11.420 "nvme_iov_md": false 00:14:11.420 }, 00:14:11.420 "memory_domains": [ 00:14:11.420 { 00:14:11.420 "dma_device_id": "system", 00:14:11.420 "dma_device_type": 1 00:14:11.420 }, 00:14:11.420 { 00:14:11.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.420 "dma_device_type": 2 00:14:11.420 } 00:14:11.420 ], 00:14:11.420 "driver_specific": {} 00:14:11.420 } 00:14:11.420 ] 00:14:11.420 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:11.420 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:11.420 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.421 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.704 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.704 "name": "Existed_Raid", 00:14:11.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.704 "strip_size_kb": 64, 00:14:11.704 "state": "configuring", 00:14:11.704 "raid_level": "raid0", 00:14:11.704 "superblock": false, 00:14:11.704 "num_base_bdevs": 4, 00:14:11.704 "num_base_bdevs_discovered": 3, 00:14:11.704 "num_base_bdevs_operational": 4, 00:14:11.704 "base_bdevs_list": [ 00:14:11.704 { 00:14:11.704 "name": "BaseBdev1", 00:14:11.704 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:11.704 "is_configured": true, 00:14:11.704 "data_offset": 0, 00:14:11.704 "data_size": 65536 00:14:11.704 }, 00:14:11.704 { 00:14:11.704 "name": "BaseBdev2", 00:14:11.704 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:11.704 "is_configured": true, 00:14:11.704 "data_offset": 0, 00:14:11.704 "data_size": 65536 00:14:11.704 }, 00:14:11.704 { 00:14:11.704 "name": "BaseBdev3", 00:14:11.704 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:11.704 "is_configured": true, 00:14:11.704 "data_offset": 0, 00:14:11.704 "data_size": 65536 00:14:11.704 }, 00:14:11.704 { 00:14:11.704 "name": "BaseBdev4", 00:14:11.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.704 "is_configured": false, 00:14:11.704 "data_offset": 0, 00:14:11.704 "data_size": 0 00:14:11.704 } 00:14:11.704 ] 00:14:11.704 }' 00:14:11.704 23:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:14:11.704 23:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:12.270 [2024-07-24 23:35:57.164086] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:12.270 [2024-07-24 23:35:57.164112] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f13d0 00:14:12.270 [2024-07-24 23:35:57.164116] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:12.270 [2024-07-24 23:35:57.164253] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f7880 00:14:12.270 [2024-07-24 23:35:57.164331] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f13d0 00:14:12.270 [2024-07-24 23:35:57.164336] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17f13d0 00:14:12.270 [2024-07-24 23:35:57.164442] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:12.270 BaseBdev4 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:12.270 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:12.527 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:12.527 [ 00:14:12.527 { 00:14:12.527 "name": "BaseBdev4", 00:14:12.527 "aliases": [ 00:14:12.527 "568a9d81-1601-46e3-b9cd-c2fa86530beb" 00:14:12.527 ], 00:14:12.527 "product_name": "Malloc disk", 00:14:12.527 "block_size": 512, 00:14:12.527 "num_blocks": 65536, 00:14:12.527 "uuid": "568a9d81-1601-46e3-b9cd-c2fa86530beb", 00:14:12.527 "assigned_rate_limits": { 00:14:12.527 "rw_ios_per_sec": 0, 00:14:12.527 "rw_mbytes_per_sec": 0, 00:14:12.527 "r_mbytes_per_sec": 0, 00:14:12.527 "w_mbytes_per_sec": 0 00:14:12.527 }, 00:14:12.527 "claimed": true, 00:14:12.527 "claim_type": "exclusive_write", 00:14:12.527 "zoned": false, 00:14:12.527 "supported_io_types": { 00:14:12.527 "read": true, 00:14:12.527 "write": true, 00:14:12.527 "unmap": true, 00:14:12.527 "flush": true, 00:14:12.527 "reset": true, 00:14:12.527 "nvme_admin": false, 00:14:12.527 "nvme_io": false, 00:14:12.527 "nvme_io_md": false, 00:14:12.527 "write_zeroes": true, 00:14:12.527 "zcopy": true, 00:14:12.527 "get_zone_info": false, 00:14:12.527 "zone_management": false, 00:14:12.527 "zone_append": false, 00:14:12.527 "compare": false, 00:14:12.527 "compare_and_write": false, 00:14:12.527 "abort": true, 00:14:12.527 "seek_hole": false, 00:14:12.527 "seek_data": false, 00:14:12.527 "copy": true, 00:14:12.527 "nvme_iov_md": false 00:14:12.527 }, 00:14:12.527 "memory_domains": [ 00:14:12.527 { 00:14:12.527 "dma_device_id": "system", 00:14:12.527 "dma_device_type": 1 00:14:12.527 }, 00:14:12.527 { 00:14:12.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.527 "dma_device_type": 2 00:14:12.527 } 00:14:12.527 ], 00:14:12.528 "driver_specific": {} 00:14:12.528 } 00:14:12.528 ] 
00:14:12.528 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:12.528 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:12.528 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:12.528 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:12.528 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.786 "name": "Existed_Raid", 00:14:12.786 "uuid": "f7cdc9a5-a9fc-4f3c-be2b-5f665e7493d8", 
00:14:12.786 "strip_size_kb": 64, 00:14:12.786 "state": "online", 00:14:12.786 "raid_level": "raid0", 00:14:12.786 "superblock": false, 00:14:12.786 "num_base_bdevs": 4, 00:14:12.786 "num_base_bdevs_discovered": 4, 00:14:12.786 "num_base_bdevs_operational": 4, 00:14:12.786 "base_bdevs_list": [ 00:14:12.786 { 00:14:12.786 "name": "BaseBdev1", 00:14:12.786 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:12.786 "is_configured": true, 00:14:12.786 "data_offset": 0, 00:14:12.786 "data_size": 65536 00:14:12.786 }, 00:14:12.786 { 00:14:12.786 "name": "BaseBdev2", 00:14:12.786 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:12.786 "is_configured": true, 00:14:12.786 "data_offset": 0, 00:14:12.786 "data_size": 65536 00:14:12.786 }, 00:14:12.786 { 00:14:12.786 "name": "BaseBdev3", 00:14:12.786 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:12.786 "is_configured": true, 00:14:12.786 "data_offset": 0, 00:14:12.786 "data_size": 65536 00:14:12.786 }, 00:14:12.786 { 00:14:12.786 "name": "BaseBdev4", 00:14:12.786 "uuid": "568a9d81-1601-46e3-b9cd-c2fa86530beb", 00:14:12.786 "is_configured": true, 00:14:12.786 "data_offset": 0, 00:14:12.786 "data_size": 65536 00:14:12.786 } 00:14:12.786 ] 00:14:12.786 }' 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.786 23:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:13.353 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:13.353 [2024-07-24 23:35:58.335344] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.644 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:13.645 "name": "Existed_Raid", 00:14:13.645 "aliases": [ 00:14:13.645 "f7cdc9a5-a9fc-4f3c-be2b-5f665e7493d8" 00:14:13.645 ], 00:14:13.645 "product_name": "Raid Volume", 00:14:13.645 "block_size": 512, 00:14:13.645 "num_blocks": 262144, 00:14:13.645 "uuid": "f7cdc9a5-a9fc-4f3c-be2b-5f665e7493d8", 00:14:13.645 "assigned_rate_limits": { 00:14:13.645 "rw_ios_per_sec": 0, 00:14:13.645 "rw_mbytes_per_sec": 0, 00:14:13.645 "r_mbytes_per_sec": 0, 00:14:13.645 "w_mbytes_per_sec": 0 00:14:13.645 }, 00:14:13.645 "claimed": false, 00:14:13.645 "zoned": false, 00:14:13.645 "supported_io_types": { 00:14:13.645 "read": true, 00:14:13.645 "write": true, 00:14:13.645 "unmap": true, 00:14:13.645 "flush": true, 00:14:13.645 "reset": true, 00:14:13.645 "nvme_admin": false, 00:14:13.645 "nvme_io": false, 00:14:13.645 "nvme_io_md": false, 00:14:13.645 "write_zeroes": true, 00:14:13.645 "zcopy": false, 00:14:13.645 "get_zone_info": false, 00:14:13.645 "zone_management": false, 00:14:13.645 "zone_append": false, 00:14:13.645 "compare": false, 00:14:13.645 "compare_and_write": false, 00:14:13.645 "abort": false, 00:14:13.645 "seek_hole": false, 00:14:13.645 "seek_data": false, 00:14:13.645 "copy": false, 00:14:13.645 "nvme_iov_md": false 00:14:13.645 }, 00:14:13.645 "memory_domains": [ 00:14:13.645 { 00:14:13.645 "dma_device_id": "system", 00:14:13.645 
"dma_device_type": 1 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.645 "dma_device_type": 2 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "system", 00:14:13.645 "dma_device_type": 1 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.645 "dma_device_type": 2 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "system", 00:14:13.645 "dma_device_type": 1 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.645 "dma_device_type": 2 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "system", 00:14:13.645 "dma_device_type": 1 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.645 "dma_device_type": 2 00:14:13.645 } 00:14:13.645 ], 00:14:13.645 "driver_specific": { 00:14:13.645 "raid": { 00:14:13.645 "uuid": "f7cdc9a5-a9fc-4f3c-be2b-5f665e7493d8", 00:14:13.645 "strip_size_kb": 64, 00:14:13.645 "state": "online", 00:14:13.645 "raid_level": "raid0", 00:14:13.645 "superblock": false, 00:14:13.645 "num_base_bdevs": 4, 00:14:13.645 "num_base_bdevs_discovered": 4, 00:14:13.645 "num_base_bdevs_operational": 4, 00:14:13.645 "base_bdevs_list": [ 00:14:13.645 { 00:14:13.645 "name": "BaseBdev1", 00:14:13.645 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:13.645 "is_configured": true, 00:14:13.645 "data_offset": 0, 00:14:13.645 "data_size": 65536 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "name": "BaseBdev2", 00:14:13.645 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:13.645 "is_configured": true, 00:14:13.645 "data_offset": 0, 00:14:13.645 "data_size": 65536 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "name": "BaseBdev3", 00:14:13.645 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:13.645 "is_configured": true, 00:14:13.645 "data_offset": 0, 00:14:13.645 "data_size": 65536 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "name": "BaseBdev4", 00:14:13.645 
"uuid": "568a9d81-1601-46e3-b9cd-c2fa86530beb", 00:14:13.645 "is_configured": true, 00:14:13.645 "data_offset": 0, 00:14:13.645 "data_size": 65536 00:14:13.645 } 00:14:13.645 ] 00:14:13.645 } 00:14:13.645 } 00:14:13.645 }' 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:13.645 BaseBdev2 00:14:13.645 BaseBdev3 00:14:13.645 BaseBdev4' 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.645 "name": "BaseBdev1", 00:14:13.645 "aliases": [ 00:14:13.645 "3062aea6-de4b-4b3b-82dd-81849f52f42c" 00:14:13.645 ], 00:14:13.645 "product_name": "Malloc disk", 00:14:13.645 "block_size": 512, 00:14:13.645 "num_blocks": 65536, 00:14:13.645 "uuid": "3062aea6-de4b-4b3b-82dd-81849f52f42c", 00:14:13.645 "assigned_rate_limits": { 00:14:13.645 "rw_ios_per_sec": 0, 00:14:13.645 "rw_mbytes_per_sec": 0, 00:14:13.645 "r_mbytes_per_sec": 0, 00:14:13.645 "w_mbytes_per_sec": 0 00:14:13.645 }, 00:14:13.645 "claimed": true, 00:14:13.645 "claim_type": "exclusive_write", 00:14:13.645 "zoned": false, 00:14:13.645 "supported_io_types": { 00:14:13.645 "read": true, 00:14:13.645 "write": true, 00:14:13.645 "unmap": true, 00:14:13.645 "flush": true, 00:14:13.645 "reset": true, 00:14:13.645 "nvme_admin": false, 00:14:13.645 "nvme_io": false, 00:14:13.645 "nvme_io_md": false, 00:14:13.645 
"write_zeroes": true, 00:14:13.645 "zcopy": true, 00:14:13.645 "get_zone_info": false, 00:14:13.645 "zone_management": false, 00:14:13.645 "zone_append": false, 00:14:13.645 "compare": false, 00:14:13.645 "compare_and_write": false, 00:14:13.645 "abort": true, 00:14:13.645 "seek_hole": false, 00:14:13.645 "seek_data": false, 00:14:13.645 "copy": true, 00:14:13.645 "nvme_iov_md": false 00:14:13.645 }, 00:14:13.645 "memory_domains": [ 00:14:13.645 { 00:14:13.645 "dma_device_id": "system", 00:14:13.645 "dma_device_type": 1 00:14:13.645 }, 00:14:13.645 { 00:14:13.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.645 "dma_device_type": 2 00:14:13.645 } 00:14:13.645 ], 00:14:13.645 "driver_specific": {} 00:14:13.645 }' 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.645 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.907 23:35:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:13.907 23:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:14.165 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.165 "name": "BaseBdev2", 00:14:14.165 "aliases": [ 00:14:14.165 "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c" 00:14:14.165 ], 00:14:14.165 "product_name": "Malloc disk", 00:14:14.165 "block_size": 512, 00:14:14.165 "num_blocks": 65536, 00:14:14.165 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:14.165 "assigned_rate_limits": { 00:14:14.165 "rw_ios_per_sec": 0, 00:14:14.165 "rw_mbytes_per_sec": 0, 00:14:14.165 "r_mbytes_per_sec": 0, 00:14:14.166 "w_mbytes_per_sec": 0 00:14:14.166 }, 00:14:14.166 "claimed": true, 00:14:14.166 "claim_type": "exclusive_write", 00:14:14.166 "zoned": false, 00:14:14.166 "supported_io_types": { 00:14:14.166 "read": true, 00:14:14.166 "write": true, 00:14:14.166 "unmap": true, 00:14:14.166 "flush": true, 00:14:14.166 "reset": true, 00:14:14.166 "nvme_admin": false, 00:14:14.166 "nvme_io": false, 00:14:14.166 "nvme_io_md": false, 00:14:14.166 "write_zeroes": true, 00:14:14.166 "zcopy": true, 00:14:14.166 "get_zone_info": false, 00:14:14.166 "zone_management": false, 00:14:14.166 "zone_append": false, 00:14:14.166 "compare": false, 00:14:14.166 "compare_and_write": false, 00:14:14.166 "abort": true, 00:14:14.166 "seek_hole": false, 00:14:14.166 "seek_data": false, 00:14:14.166 "copy": true, 00:14:14.166 "nvme_iov_md": false 00:14:14.166 }, 00:14:14.166 "memory_domains": [ 00:14:14.166 { 00:14:14.166 "dma_device_id": "system", 00:14:14.166 "dma_device_type": 1 00:14:14.166 }, 00:14:14.166 { 00:14:14.166 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:14.166 "dma_device_type": 2 00:14:14.166 } 00:14:14.166 ], 00:14:14.166 "driver_specific": {} 00:14:14.166 }' 00:14:14.166 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.166 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.166 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.166 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:14.425 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.684 "name": "BaseBdev3", 00:14:14.684 "aliases": [ 00:14:14.684 
"7ee761cf-7016-4bd7-bfe9-406968781967" 00:14:14.684 ], 00:14:14.684 "product_name": "Malloc disk", 00:14:14.684 "block_size": 512, 00:14:14.684 "num_blocks": 65536, 00:14:14.684 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:14.684 "assigned_rate_limits": { 00:14:14.684 "rw_ios_per_sec": 0, 00:14:14.684 "rw_mbytes_per_sec": 0, 00:14:14.684 "r_mbytes_per_sec": 0, 00:14:14.684 "w_mbytes_per_sec": 0 00:14:14.684 }, 00:14:14.684 "claimed": true, 00:14:14.684 "claim_type": "exclusive_write", 00:14:14.684 "zoned": false, 00:14:14.684 "supported_io_types": { 00:14:14.684 "read": true, 00:14:14.684 "write": true, 00:14:14.684 "unmap": true, 00:14:14.684 "flush": true, 00:14:14.684 "reset": true, 00:14:14.684 "nvme_admin": false, 00:14:14.684 "nvme_io": false, 00:14:14.684 "nvme_io_md": false, 00:14:14.684 "write_zeroes": true, 00:14:14.684 "zcopy": true, 00:14:14.684 "get_zone_info": false, 00:14:14.684 "zone_management": false, 00:14:14.684 "zone_append": false, 00:14:14.684 "compare": false, 00:14:14.684 "compare_and_write": false, 00:14:14.684 "abort": true, 00:14:14.684 "seek_hole": false, 00:14:14.684 "seek_data": false, 00:14:14.684 "copy": true, 00:14:14.684 "nvme_iov_md": false 00:14:14.684 }, 00:14:14.684 "memory_domains": [ 00:14:14.684 { 00:14:14.684 "dma_device_id": "system", 00:14:14.684 "dma_device_type": 1 00:14:14.684 }, 00:14:14.684 { 00:14:14.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.684 "dma_device_type": 2 00:14:14.684 } 00:14:14.684 ], 00:14:14.684 "driver_specific": {} 00:14:14.684 }' 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.684 23:35:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.684 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.942 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.943 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:14.943 23:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:15.202 "name": "BaseBdev4", 00:14:15.202 "aliases": [ 00:14:15.202 "568a9d81-1601-46e3-b9cd-c2fa86530beb" 00:14:15.202 ], 00:14:15.202 "product_name": "Malloc disk", 00:14:15.202 "block_size": 512, 00:14:15.202 "num_blocks": 65536, 00:14:15.202 "uuid": "568a9d81-1601-46e3-b9cd-c2fa86530beb", 00:14:15.202 "assigned_rate_limits": { 00:14:15.202 "rw_ios_per_sec": 0, 00:14:15.202 "rw_mbytes_per_sec": 0, 00:14:15.202 "r_mbytes_per_sec": 0, 00:14:15.202 "w_mbytes_per_sec": 0 00:14:15.202 }, 00:14:15.202 "claimed": true, 00:14:15.202 "claim_type": "exclusive_write", 00:14:15.202 "zoned": false, 00:14:15.202 "supported_io_types": { 00:14:15.202 "read": true, 
00:14:15.202 "write": true, 00:14:15.202 "unmap": true, 00:14:15.202 "flush": true, 00:14:15.202 "reset": true, 00:14:15.202 "nvme_admin": false, 00:14:15.202 "nvme_io": false, 00:14:15.202 "nvme_io_md": false, 00:14:15.202 "write_zeroes": true, 00:14:15.202 "zcopy": true, 00:14:15.202 "get_zone_info": false, 00:14:15.202 "zone_management": false, 00:14:15.202 "zone_append": false, 00:14:15.202 "compare": false, 00:14:15.202 "compare_and_write": false, 00:14:15.202 "abort": true, 00:14:15.202 "seek_hole": false, 00:14:15.202 "seek_data": false, 00:14:15.202 "copy": true, 00:14:15.202 "nvme_iov_md": false 00:14:15.202 }, 00:14:15.202 "memory_domains": [ 00:14:15.202 { 00:14:15.202 "dma_device_id": "system", 00:14:15.202 "dma_device_type": 1 00:14:15.202 }, 00:14:15.202 { 00:14:15.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.202 "dma_device_type": 2 00:14:15.202 } 00:14:15.202 ], 00:14:15.202 "driver_specific": {} 00:14:15.202 }' 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:15.202 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.461 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:15.461 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:15.461 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.461 
23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:15.461 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:15.461 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:15.720 [2024-07-24 23:36:00.472701] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:15.720 [2024-07-24 23:36:00.472721] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:15.720 [2024-07-24 23:36:00.472755] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.720 "name": "Existed_Raid", 00:14:15.720 "uuid": "f7cdc9a5-a9fc-4f3c-be2b-5f665e7493d8", 00:14:15.720 "strip_size_kb": 64, 00:14:15.720 "state": "offline", 00:14:15.720 "raid_level": "raid0", 00:14:15.720 "superblock": false, 00:14:15.720 "num_base_bdevs": 4, 00:14:15.720 "num_base_bdevs_discovered": 3, 00:14:15.720 "num_base_bdevs_operational": 3, 00:14:15.720 "base_bdevs_list": [ 00:14:15.720 { 00:14:15.720 "name": null, 00:14:15.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.720 "is_configured": false, 00:14:15.720 "data_offset": 0, 00:14:15.720 "data_size": 65536 00:14:15.720 }, 00:14:15.720 { 00:14:15.720 "name": "BaseBdev2", 00:14:15.720 "uuid": "eceda47f-3c7c-4f04-a16c-3bc89d18fc8c", 00:14:15.720 "is_configured": true, 00:14:15.720 "data_offset": 0, 00:14:15.720 "data_size": 65536 00:14:15.720 }, 00:14:15.720 { 00:14:15.720 "name": "BaseBdev3", 00:14:15.720 "uuid": "7ee761cf-7016-4bd7-bfe9-406968781967", 00:14:15.720 "is_configured": true, 00:14:15.720 "data_offset": 0, 00:14:15.720 "data_size": 65536 00:14:15.720 }, 00:14:15.720 { 00:14:15.720 "name": "BaseBdev4", 00:14:15.720 
"uuid": "568a9d81-1601-46e3-b9cd-c2fa86530beb", 00:14:15.720 "is_configured": true, 00:14:15.720 "data_offset": 0, 00:14:15.720 "data_size": 65536 00:14:15.720 } 00:14:15.720 ] 00:14:15.720 }' 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.720 23:36:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.287 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:16.287 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:16.287 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.287 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:16.546 [2024-07-24 23:36:01.492174] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:16.546 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.804 
23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:16.804 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:16.804 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:17.063 [2024-07-24 23:36:01.834909] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:17.063 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:17.063 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:17.063 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.063 23:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:17.063 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:17.063 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:17.063 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:17.321 [2024-07-24 23:36:02.185647] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:17.321 [2024-07-24 23:36:02.185677] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f13d0 name Existed_Raid, state offline 00:14:17.321 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:17.321 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:17.321 23:36:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.321 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:17.579 BaseBdev2 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:17.579 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.838 23:36:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:18.097 [ 00:14:18.097 { 00:14:18.097 "name": "BaseBdev2", 00:14:18.097 "aliases": [ 00:14:18.097 "4fb43d2b-24ce-45c1-b48b-aece9424a1a5" 00:14:18.097 ], 00:14:18.097 "product_name": "Malloc disk", 00:14:18.097 "block_size": 512, 00:14:18.097 "num_blocks": 65536, 00:14:18.097 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:18.097 "assigned_rate_limits": { 00:14:18.098 "rw_ios_per_sec": 0, 00:14:18.098 "rw_mbytes_per_sec": 0, 00:14:18.098 "r_mbytes_per_sec": 0, 00:14:18.098 "w_mbytes_per_sec": 0 00:14:18.098 }, 00:14:18.098 "claimed": false, 00:14:18.098 "zoned": false, 00:14:18.098 "supported_io_types": { 00:14:18.098 "read": true, 00:14:18.098 "write": true, 00:14:18.098 "unmap": true, 00:14:18.098 "flush": true, 00:14:18.098 "reset": true, 00:14:18.098 "nvme_admin": false, 00:14:18.098 "nvme_io": false, 00:14:18.098 "nvme_io_md": false, 00:14:18.098 "write_zeroes": true, 00:14:18.098 "zcopy": true, 00:14:18.098 "get_zone_info": false, 00:14:18.098 "zone_management": false, 00:14:18.098 "zone_append": false, 00:14:18.098 "compare": false, 00:14:18.098 "compare_and_write": false, 00:14:18.098 "abort": true, 00:14:18.098 "seek_hole": false, 00:14:18.098 "seek_data": false, 00:14:18.098 "copy": true, 00:14:18.098 "nvme_iov_md": false 00:14:18.098 }, 00:14:18.098 "memory_domains": [ 00:14:18.098 { 00:14:18.098 "dma_device_id": "system", 00:14:18.098 "dma_device_type": 1 00:14:18.098 }, 00:14:18.098 { 00:14:18.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.098 "dma_device_type": 2 00:14:18.098 } 00:14:18.098 ], 00:14:18.098 "driver_specific": {} 00:14:18.098 } 00:14:18.098 ] 00:14:18.098 23:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:18.098 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:18.098 23:36:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:18.098 23:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:18.098 BaseBdev3 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:18.098 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:18.357 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:18.357 [ 00:14:18.357 { 00:14:18.357 "name": "BaseBdev3", 00:14:18.357 "aliases": [ 00:14:18.357 "53473a8b-5fb4-4151-ae68-65bf62fa4d95" 00:14:18.357 ], 00:14:18.357 "product_name": "Malloc disk", 00:14:18.357 "block_size": 512, 00:14:18.357 "num_blocks": 65536, 00:14:18.357 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:18.357 "assigned_rate_limits": { 00:14:18.357 "rw_ios_per_sec": 0, 00:14:18.357 "rw_mbytes_per_sec": 0, 00:14:18.357 "r_mbytes_per_sec": 0, 00:14:18.357 "w_mbytes_per_sec": 0 00:14:18.357 }, 00:14:18.357 "claimed": false, 00:14:18.357 
"zoned": false, 00:14:18.357 "supported_io_types": { 00:14:18.357 "read": true, 00:14:18.357 "write": true, 00:14:18.357 "unmap": true, 00:14:18.357 "flush": true, 00:14:18.357 "reset": true, 00:14:18.357 "nvme_admin": false, 00:14:18.357 "nvme_io": false, 00:14:18.357 "nvme_io_md": false, 00:14:18.357 "write_zeroes": true, 00:14:18.357 "zcopy": true, 00:14:18.357 "get_zone_info": false, 00:14:18.357 "zone_management": false, 00:14:18.357 "zone_append": false, 00:14:18.357 "compare": false, 00:14:18.357 "compare_and_write": false, 00:14:18.357 "abort": true, 00:14:18.357 "seek_hole": false, 00:14:18.357 "seek_data": false, 00:14:18.357 "copy": true, 00:14:18.357 "nvme_iov_md": false 00:14:18.357 }, 00:14:18.357 "memory_domains": [ 00:14:18.357 { 00:14:18.357 "dma_device_id": "system", 00:14:18.357 "dma_device_type": 1 00:14:18.357 }, 00:14:18.357 { 00:14:18.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.357 "dma_device_type": 2 00:14:18.357 } 00:14:18.357 ], 00:14:18.357 "driver_specific": {} 00:14:18.357 } 00:14:18.357 ] 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:18.615 BaseBdev4 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:18.615 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:18.873 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:18.873 [ 00:14:18.873 { 00:14:18.873 "name": "BaseBdev4", 00:14:18.873 "aliases": [ 00:14:18.873 "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc" 00:14:18.873 ], 00:14:18.873 "product_name": "Malloc disk", 00:14:18.873 "block_size": 512, 00:14:18.873 "num_blocks": 65536, 00:14:18.873 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:18.873 "assigned_rate_limits": { 00:14:18.873 "rw_ios_per_sec": 0, 00:14:18.873 "rw_mbytes_per_sec": 0, 00:14:18.873 "r_mbytes_per_sec": 0, 00:14:18.873 "w_mbytes_per_sec": 0 00:14:18.873 }, 00:14:18.873 "claimed": false, 00:14:18.873 "zoned": false, 00:14:18.873 "supported_io_types": { 00:14:18.873 "read": true, 00:14:18.873 "write": true, 00:14:18.873 "unmap": true, 00:14:18.873 "flush": true, 00:14:18.873 "reset": true, 00:14:18.873 "nvme_admin": false, 00:14:18.873 "nvme_io": false, 00:14:18.873 "nvme_io_md": false, 00:14:18.873 "write_zeroes": true, 00:14:18.873 "zcopy": true, 00:14:18.873 "get_zone_info": false, 00:14:18.873 "zone_management": false, 00:14:18.873 "zone_append": false, 00:14:18.873 "compare": false, 00:14:18.873 "compare_and_write": false, 00:14:18.873 "abort": true, 00:14:18.873 "seek_hole": false, 00:14:18.873 "seek_data": false, 00:14:18.873 "copy": true, 00:14:18.873 "nvme_iov_md": false 00:14:18.873 }, 00:14:18.874 
"memory_domains": [ 00:14:18.874 { 00:14:18.874 "dma_device_id": "system", 00:14:18.874 "dma_device_type": 1 00:14:18.874 }, 00:14:18.874 { 00:14:18.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.874 "dma_device_type": 2 00:14:18.874 } 00:14:18.874 ], 00:14:18.874 "driver_specific": {} 00:14:18.874 } 00:14:18.874 ] 00:14:18.874 23:36:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:18.874 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:18.874 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:18.874 23:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:19.133 [2024-07-24 23:36:03.999282] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:19.133 [2024-07-24 23:36:03.999310] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:19.133 [2024-07-24 23:36:03.999322] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:19.133 [2024-07-24 23:36:04.000488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:19.133 [2024-07-24 23:36:04.000519] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.133 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.392 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.392 "name": "Existed_Raid", 00:14:19.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.392 "strip_size_kb": 64, 00:14:19.392 "state": "configuring", 00:14:19.392 "raid_level": "raid0", 00:14:19.392 "superblock": false, 00:14:19.392 "num_base_bdevs": 4, 00:14:19.392 "num_base_bdevs_discovered": 3, 00:14:19.392 "num_base_bdevs_operational": 4, 00:14:19.392 "base_bdevs_list": [ 00:14:19.392 { 00:14:19.392 "name": "BaseBdev1", 00:14:19.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.392 "is_configured": false, 00:14:19.392 "data_offset": 0, 00:14:19.392 "data_size": 0 00:14:19.392 }, 00:14:19.392 { 00:14:19.392 "name": "BaseBdev2", 00:14:19.392 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:19.392 "is_configured": true, 00:14:19.392 "data_offset": 0, 00:14:19.392 "data_size": 65536 00:14:19.392 }, 
00:14:19.392 { 00:14:19.392 "name": "BaseBdev3", 00:14:19.392 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:19.392 "is_configured": true, 00:14:19.392 "data_offset": 0, 00:14:19.392 "data_size": 65536 00:14:19.392 }, 00:14:19.392 { 00:14:19.392 "name": "BaseBdev4", 00:14:19.392 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:19.392 "is_configured": true, 00:14:19.392 "data_offset": 0, 00:14:19.392 "data_size": 65536 00:14:19.392 } 00:14:19.392 ] 00:14:19.392 }' 00:14:19.392 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.392 23:36:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:19.959 [2024-07-24 23:36:04.809367] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.959 23:36:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.959 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.217 23:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.217 "name": "Existed_Raid", 00:14:20.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.217 "strip_size_kb": 64, 00:14:20.217 "state": "configuring", 00:14:20.217 "raid_level": "raid0", 00:14:20.217 "superblock": false, 00:14:20.217 "num_base_bdevs": 4, 00:14:20.217 "num_base_bdevs_discovered": 2, 00:14:20.217 "num_base_bdevs_operational": 4, 00:14:20.217 "base_bdevs_list": [ 00:14:20.217 { 00:14:20.217 "name": "BaseBdev1", 00:14:20.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.217 "is_configured": false, 00:14:20.217 "data_offset": 0, 00:14:20.217 "data_size": 0 00:14:20.218 }, 00:14:20.218 { 00:14:20.218 "name": null, 00:14:20.218 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:20.218 "is_configured": false, 00:14:20.218 "data_offset": 0, 00:14:20.218 "data_size": 65536 00:14:20.218 }, 00:14:20.218 { 00:14:20.218 "name": "BaseBdev3", 00:14:20.218 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:20.218 "is_configured": true, 00:14:20.218 "data_offset": 0, 00:14:20.218 "data_size": 65536 00:14:20.218 }, 00:14:20.218 { 00:14:20.218 "name": "BaseBdev4", 00:14:20.218 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:20.218 "is_configured": true, 00:14:20.218 "data_offset": 0, 00:14:20.218 "data_size": 65536 00:14:20.218 } 00:14:20.218 ] 00:14:20.218 }' 00:14:20.218 23:36:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.218 23:36:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.476 23:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.476 23:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:20.734 23:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:20.734 23:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:20.993 [2024-07-24 23:36:05.794496] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:20.993 BaseBdev1 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:20.993 23:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:21.252 [ 00:14:21.252 { 00:14:21.252 "name": "BaseBdev1", 00:14:21.252 "aliases": [ 00:14:21.252 "cb71e775-aad8-4ebc-8210-a18e9d691afc" 00:14:21.252 ], 00:14:21.252 "product_name": "Malloc disk", 00:14:21.252 "block_size": 512, 00:14:21.252 "num_blocks": 65536, 00:14:21.252 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:21.252 "assigned_rate_limits": { 00:14:21.252 "rw_ios_per_sec": 0, 00:14:21.252 "rw_mbytes_per_sec": 0, 00:14:21.252 "r_mbytes_per_sec": 0, 00:14:21.252 "w_mbytes_per_sec": 0 00:14:21.252 }, 00:14:21.252 "claimed": true, 00:14:21.252 "claim_type": "exclusive_write", 00:14:21.252 "zoned": false, 00:14:21.252 "supported_io_types": { 00:14:21.252 "read": true, 00:14:21.252 "write": true, 00:14:21.252 "unmap": true, 00:14:21.252 "flush": true, 00:14:21.252 "reset": true, 00:14:21.252 "nvme_admin": false, 00:14:21.252 "nvme_io": false, 00:14:21.252 "nvme_io_md": false, 00:14:21.252 "write_zeroes": true, 00:14:21.252 "zcopy": true, 00:14:21.252 "get_zone_info": false, 00:14:21.252 "zone_management": false, 00:14:21.252 "zone_append": false, 00:14:21.252 "compare": false, 00:14:21.252 "compare_and_write": false, 00:14:21.252 "abort": true, 00:14:21.252 "seek_hole": false, 00:14:21.252 "seek_data": false, 00:14:21.252 "copy": true, 00:14:21.252 "nvme_iov_md": false 00:14:21.252 }, 00:14:21.252 "memory_domains": [ 00:14:21.252 { 00:14:21.252 "dma_device_id": "system", 00:14:21.252 "dma_device_type": 1 00:14:21.252 }, 00:14:21.252 { 00:14:21.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.252 "dma_device_type": 2 00:14:21.252 } 00:14:21.252 ], 00:14:21.252 "driver_specific": {} 00:14:21.252 } 00:14:21.252 ] 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.252 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:21.510 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.510 "name": "Existed_Raid", 00:14:21.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:21.510 "strip_size_kb": 64, 00:14:21.510 "state": "configuring", 00:14:21.510 "raid_level": "raid0", 00:14:21.510 "superblock": false, 00:14:21.510 "num_base_bdevs": 4, 00:14:21.510 "num_base_bdevs_discovered": 3, 00:14:21.510 "num_base_bdevs_operational": 4, 00:14:21.510 "base_bdevs_list": [ 00:14:21.510 { 00:14:21.510 "name": "BaseBdev1", 00:14:21.510 "uuid": 
"cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:21.510 "is_configured": true, 00:14:21.510 "data_offset": 0, 00:14:21.510 "data_size": 65536 00:14:21.510 }, 00:14:21.510 { 00:14:21.510 "name": null, 00:14:21.510 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:21.510 "is_configured": false, 00:14:21.510 "data_offset": 0, 00:14:21.510 "data_size": 65536 00:14:21.510 }, 00:14:21.510 { 00:14:21.510 "name": "BaseBdev3", 00:14:21.510 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:21.510 "is_configured": true, 00:14:21.510 "data_offset": 0, 00:14:21.510 "data_size": 65536 00:14:21.510 }, 00:14:21.510 { 00:14:21.510 "name": "BaseBdev4", 00:14:21.510 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:21.510 "is_configured": true, 00:14:21.510 "data_offset": 0, 00:14:21.510 "data_size": 65536 00:14:21.510 } 00:14:21.510 ] 00:14:21.510 }' 00:14:21.510 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.510 23:36:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.768 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.768 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:22.026 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:22.026 23:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:22.284 [2024-07-24 23:36:07.077844] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:22.285 23:36:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.285 "name": "Existed_Raid", 00:14:22.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.285 "strip_size_kb": 64, 00:14:22.285 "state": "configuring", 00:14:22.285 "raid_level": "raid0", 00:14:22.285 "superblock": false, 00:14:22.285 "num_base_bdevs": 4, 00:14:22.285 "num_base_bdevs_discovered": 2, 00:14:22.285 "num_base_bdevs_operational": 4, 00:14:22.285 "base_bdevs_list": [ 00:14:22.285 { 00:14:22.285 "name": "BaseBdev1", 00:14:22.285 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:22.285 "is_configured": true, 00:14:22.285 
"data_offset": 0, 00:14:22.285 "data_size": 65536 00:14:22.285 }, 00:14:22.285 { 00:14:22.285 "name": null, 00:14:22.285 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:22.285 "is_configured": false, 00:14:22.285 "data_offset": 0, 00:14:22.285 "data_size": 65536 00:14:22.285 }, 00:14:22.285 { 00:14:22.285 "name": null, 00:14:22.285 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:22.285 "is_configured": false, 00:14:22.285 "data_offset": 0, 00:14:22.285 "data_size": 65536 00:14:22.285 }, 00:14:22.285 { 00:14:22.285 "name": "BaseBdev4", 00:14:22.285 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:22.285 "is_configured": true, 00:14:22.285 "data_offset": 0, 00:14:22.285 "data_size": 65536 00:14:22.285 } 00:14:22.285 ] 00:14:22.285 }' 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.285 23:36:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.852 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:22.852 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.110 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:23.110 23:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:23.110 [2024-07-24 23:36:08.100505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.368 "name": "Existed_Raid", 00:14:23.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.368 "strip_size_kb": 64, 00:14:23.368 "state": "configuring", 00:14:23.368 "raid_level": "raid0", 00:14:23.368 "superblock": false, 00:14:23.368 "num_base_bdevs": 4, 00:14:23.368 "num_base_bdevs_discovered": 3, 00:14:23.368 "num_base_bdevs_operational": 4, 00:14:23.368 "base_bdevs_list": [ 00:14:23.368 { 00:14:23.368 "name": "BaseBdev1", 00:14:23.368 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:23.368 "is_configured": true, 00:14:23.368 "data_offset": 0, 00:14:23.368 "data_size": 65536 00:14:23.368 }, 00:14:23.368 { 
00:14:23.368 "name": null, 00:14:23.368 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:23.368 "is_configured": false, 00:14:23.368 "data_offset": 0, 00:14:23.368 "data_size": 65536 00:14:23.368 }, 00:14:23.368 { 00:14:23.368 "name": "BaseBdev3", 00:14:23.368 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:23.368 "is_configured": true, 00:14:23.368 "data_offset": 0, 00:14:23.368 "data_size": 65536 00:14:23.368 }, 00:14:23.368 { 00:14:23.368 "name": "BaseBdev4", 00:14:23.368 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:23.368 "is_configured": true, 00:14:23.368 "data_offset": 0, 00:14:23.368 "data_size": 65536 00:14:23.368 } 00:14:23.368 ] 00:14:23.368 }' 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.368 23:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.935 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.935 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:24.193 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:24.193 23:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:24.193 [2024-07-24 23:36:09.147237] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.193 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.452 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.452 "name": "Existed_Raid", 00:14:24.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.452 "strip_size_kb": 64, 00:14:24.452 "state": "configuring", 00:14:24.452 "raid_level": "raid0", 00:14:24.452 "superblock": false, 00:14:24.452 "num_base_bdevs": 4, 00:14:24.452 "num_base_bdevs_discovered": 2, 00:14:24.452 "num_base_bdevs_operational": 4, 00:14:24.452 "base_bdevs_list": [ 00:14:24.452 { 00:14:24.452 "name": null, 00:14:24.452 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:24.452 "is_configured": false, 00:14:24.452 "data_offset": 0, 00:14:24.452 "data_size": 65536 00:14:24.452 }, 00:14:24.452 { 00:14:24.452 "name": null, 00:14:24.452 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:24.452 "is_configured": false, 
00:14:24.452 "data_offset": 0, 00:14:24.452 "data_size": 65536 00:14:24.452 }, 00:14:24.452 { 00:14:24.452 "name": "BaseBdev3", 00:14:24.452 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:24.452 "is_configured": true, 00:14:24.452 "data_offset": 0, 00:14:24.452 "data_size": 65536 00:14:24.452 }, 00:14:24.452 { 00:14:24.452 "name": "BaseBdev4", 00:14:24.452 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:24.452 "is_configured": true, 00:14:24.452 "data_offset": 0, 00:14:24.452 "data_size": 65536 00:14:24.452 } 00:14:24.452 ] 00:14:24.452 }' 00:14:24.452 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.452 23:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.018 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.018 23:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:25.277 [2024-07-24 23:36:10.187720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.277 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.535 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.535 "name": "Existed_Raid", 00:14:25.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.535 "strip_size_kb": 64, 00:14:25.535 "state": "configuring", 00:14:25.535 "raid_level": "raid0", 00:14:25.535 "superblock": false, 00:14:25.535 "num_base_bdevs": 4, 00:14:25.535 "num_base_bdevs_discovered": 3, 00:14:25.535 "num_base_bdevs_operational": 4, 00:14:25.535 "base_bdevs_list": [ 00:14:25.535 { 00:14:25.535 "name": null, 00:14:25.535 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:25.535 "is_configured": false, 00:14:25.535 "data_offset": 0, 00:14:25.535 "data_size": 65536 00:14:25.535 }, 00:14:25.535 { 00:14:25.535 "name": "BaseBdev2", 00:14:25.535 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:25.535 "is_configured": true, 00:14:25.535 "data_offset": 0, 00:14:25.535 "data_size": 65536 00:14:25.535 }, 
00:14:25.536 { 00:14:25.536 "name": "BaseBdev3", 00:14:25.536 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:25.536 "is_configured": true, 00:14:25.536 "data_offset": 0, 00:14:25.536 "data_size": 65536 00:14:25.536 }, 00:14:25.536 { 00:14:25.536 "name": "BaseBdev4", 00:14:25.536 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:25.536 "is_configured": true, 00:14:25.536 "data_offset": 0, 00:14:25.536 "data_size": 65536 00:14:25.536 } 00:14:25.536 ] 00:14:25.536 }' 00:14:25.536 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.536 23:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.101 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.101 23:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:26.101 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:26.101 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:26.101 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.358 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cb71e775-aad8-4ebc-8210-a18e9d691afc 00:14:26.358 [2024-07-24 23:36:11.353363] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:26.358 [2024-07-24 23:36:11.353390] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f57e0 00:14:26.358 [2024-07-24 23:36:11.353393] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:26.358 [2024-07-24 23:36:11.353528] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ef580 00:14:26.358 [2024-07-24 23:36:11.353605] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f57e0 00:14:26.358 [2024-07-24 23:36:11.353609] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17f57e0 00:14:26.358 [2024-07-24 23:36:11.353736] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.358 NewBaseBdev 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:26.616 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:26.617 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:26.617 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:26.876 [ 00:14:26.876 { 00:14:26.876 "name": "NewBaseBdev", 00:14:26.876 "aliases": [ 00:14:26.876 "cb71e775-aad8-4ebc-8210-a18e9d691afc" 00:14:26.876 ], 00:14:26.876 "product_name": "Malloc disk", 00:14:26.876 "block_size": 512, 00:14:26.876 "num_blocks": 65536, 
00:14:26.876 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:26.876 "assigned_rate_limits": { 00:14:26.876 "rw_ios_per_sec": 0, 00:14:26.876 "rw_mbytes_per_sec": 0, 00:14:26.876 "r_mbytes_per_sec": 0, 00:14:26.876 "w_mbytes_per_sec": 0 00:14:26.876 }, 00:14:26.876 "claimed": true, 00:14:26.876 "claim_type": "exclusive_write", 00:14:26.876 "zoned": false, 00:14:26.876 "supported_io_types": { 00:14:26.876 "read": true, 00:14:26.876 "write": true, 00:14:26.876 "unmap": true, 00:14:26.876 "flush": true, 00:14:26.876 "reset": true, 00:14:26.876 "nvme_admin": false, 00:14:26.876 "nvme_io": false, 00:14:26.876 "nvme_io_md": false, 00:14:26.876 "write_zeroes": true, 00:14:26.876 "zcopy": true, 00:14:26.876 "get_zone_info": false, 00:14:26.876 "zone_management": false, 00:14:26.876 "zone_append": false, 00:14:26.876 "compare": false, 00:14:26.876 "compare_and_write": false, 00:14:26.876 "abort": true, 00:14:26.876 "seek_hole": false, 00:14:26.876 "seek_data": false, 00:14:26.876 "copy": true, 00:14:26.876 "nvme_iov_md": false 00:14:26.876 }, 00:14:26.876 "memory_domains": [ 00:14:26.876 { 00:14:26.876 "dma_device_id": "system", 00:14:26.876 "dma_device_type": 1 00:14:26.876 }, 00:14:26.876 { 00:14:26.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.876 "dma_device_type": 2 00:14:26.876 } 00:14:26.876 ], 00:14:26.876 "driver_specific": {} 00:14:26.876 } 00:14:26.876 ] 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.876 
23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.876 "name": "Existed_Raid", 00:14:26.876 "uuid": "b75de686-e49e-4a52-b36d-b8fc7cdf0ec1", 00:14:26.876 "strip_size_kb": 64, 00:14:26.876 "state": "online", 00:14:26.876 "raid_level": "raid0", 00:14:26.876 "superblock": false, 00:14:26.876 "num_base_bdevs": 4, 00:14:26.876 "num_base_bdevs_discovered": 4, 00:14:26.876 "num_base_bdevs_operational": 4, 00:14:26.876 "base_bdevs_list": [ 00:14:26.876 { 00:14:26.876 "name": "NewBaseBdev", 00:14:26.876 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:26.876 "is_configured": true, 00:14:26.876 "data_offset": 0, 00:14:26.876 "data_size": 65536 00:14:26.876 }, 00:14:26.876 { 00:14:26.876 "name": "BaseBdev2", 00:14:26.876 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:26.876 "is_configured": true, 00:14:26.876 "data_offset": 0, 00:14:26.876 "data_size": 65536 00:14:26.876 }, 00:14:26.876 { 00:14:26.876 "name": "BaseBdev3", 00:14:26.876 
"uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:26.876 "is_configured": true, 00:14:26.876 "data_offset": 0, 00:14:26.876 "data_size": 65536 00:14:26.876 }, 00:14:26.876 { 00:14:26.876 "name": "BaseBdev4", 00:14:26.876 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:26.876 "is_configured": true, 00:14:26.876 "data_offset": 0, 00:14:26.876 "data_size": 65536 00:14:26.876 } 00:14:26.876 ] 00:14:26.876 }' 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.876 23:36:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:27.444 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:27.703 [2024-07-24 23:36:12.516578] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:27.703 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:27.703 "name": "Existed_Raid", 00:14:27.703 "aliases": [ 00:14:27.703 "b75de686-e49e-4a52-b36d-b8fc7cdf0ec1" 00:14:27.703 ], 00:14:27.703 "product_name": "Raid Volume", 
00:14:27.703 "block_size": 512, 00:14:27.703 "num_blocks": 262144, 00:14:27.703 "uuid": "b75de686-e49e-4a52-b36d-b8fc7cdf0ec1", 00:14:27.703 "assigned_rate_limits": { 00:14:27.703 "rw_ios_per_sec": 0, 00:14:27.703 "rw_mbytes_per_sec": 0, 00:14:27.703 "r_mbytes_per_sec": 0, 00:14:27.703 "w_mbytes_per_sec": 0 00:14:27.703 }, 00:14:27.703 "claimed": false, 00:14:27.703 "zoned": false, 00:14:27.703 "supported_io_types": { 00:14:27.703 "read": true, 00:14:27.703 "write": true, 00:14:27.703 "unmap": true, 00:14:27.703 "flush": true, 00:14:27.703 "reset": true, 00:14:27.703 "nvme_admin": false, 00:14:27.703 "nvme_io": false, 00:14:27.703 "nvme_io_md": false, 00:14:27.703 "write_zeroes": true, 00:14:27.703 "zcopy": false, 00:14:27.703 "get_zone_info": false, 00:14:27.703 "zone_management": false, 00:14:27.703 "zone_append": false, 00:14:27.703 "compare": false, 00:14:27.703 "compare_and_write": false, 00:14:27.703 "abort": false, 00:14:27.703 "seek_hole": false, 00:14:27.703 "seek_data": false, 00:14:27.703 "copy": false, 00:14:27.703 "nvme_iov_md": false 00:14:27.703 }, 00:14:27.703 "memory_domains": [ 00:14:27.703 { 00:14:27.703 "dma_device_id": "system", 00:14:27.703 "dma_device_type": 1 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.703 "dma_device_type": 2 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "system", 00:14:27.703 "dma_device_type": 1 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.703 "dma_device_type": 2 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "system", 00:14:27.703 "dma_device_type": 1 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.703 "dma_device_type": 2 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "system", 00:14:27.703 "dma_device_type": 1 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.703 "dma_device_type": 2 
00:14:27.703 } 00:14:27.703 ], 00:14:27.703 "driver_specific": { 00:14:27.703 "raid": { 00:14:27.703 "uuid": "b75de686-e49e-4a52-b36d-b8fc7cdf0ec1", 00:14:27.703 "strip_size_kb": 64, 00:14:27.703 "state": "online", 00:14:27.703 "raid_level": "raid0", 00:14:27.703 "superblock": false, 00:14:27.703 "num_base_bdevs": 4, 00:14:27.703 "num_base_bdevs_discovered": 4, 00:14:27.703 "num_base_bdevs_operational": 4, 00:14:27.703 "base_bdevs_list": [ 00:14:27.703 { 00:14:27.703 "name": "NewBaseBdev", 00:14:27.703 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:27.703 "is_configured": true, 00:14:27.703 "data_offset": 0, 00:14:27.703 "data_size": 65536 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "name": "BaseBdev2", 00:14:27.703 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:27.703 "is_configured": true, 00:14:27.703 "data_offset": 0, 00:14:27.703 "data_size": 65536 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "name": "BaseBdev3", 00:14:27.703 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:27.703 "is_configured": true, 00:14:27.703 "data_offset": 0, 00:14:27.703 "data_size": 65536 00:14:27.703 }, 00:14:27.703 { 00:14:27.703 "name": "BaseBdev4", 00:14:27.703 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:27.703 "is_configured": true, 00:14:27.703 "data_offset": 0, 00:14:27.703 "data_size": 65536 00:14:27.703 } 00:14:27.703 ] 00:14:27.703 } 00:14:27.703 } 00:14:27.703 }' 00:14:27.703 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:27.703 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:27.703 BaseBdev2 00:14:27.703 BaseBdev3 00:14:27.703 BaseBdev4' 00:14:27.703 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:27.703 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:27.703 23:36:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:27.962 "name": "NewBaseBdev", 00:14:27.962 "aliases": [ 00:14:27.962 "cb71e775-aad8-4ebc-8210-a18e9d691afc" 00:14:27.962 ], 00:14:27.962 "product_name": "Malloc disk", 00:14:27.962 "block_size": 512, 00:14:27.962 "num_blocks": 65536, 00:14:27.962 "uuid": "cb71e775-aad8-4ebc-8210-a18e9d691afc", 00:14:27.962 "assigned_rate_limits": { 00:14:27.962 "rw_ios_per_sec": 0, 00:14:27.962 "rw_mbytes_per_sec": 0, 00:14:27.962 "r_mbytes_per_sec": 0, 00:14:27.962 "w_mbytes_per_sec": 0 00:14:27.962 }, 00:14:27.962 "claimed": true, 00:14:27.962 "claim_type": "exclusive_write", 00:14:27.962 "zoned": false, 00:14:27.962 "supported_io_types": { 00:14:27.962 "read": true, 00:14:27.962 "write": true, 00:14:27.962 "unmap": true, 00:14:27.962 "flush": true, 00:14:27.962 "reset": true, 00:14:27.962 "nvme_admin": false, 00:14:27.962 "nvme_io": false, 00:14:27.962 "nvme_io_md": false, 00:14:27.962 "write_zeroes": true, 00:14:27.962 "zcopy": true, 00:14:27.962 "get_zone_info": false, 00:14:27.962 "zone_management": false, 00:14:27.962 "zone_append": false, 00:14:27.962 "compare": false, 00:14:27.962 "compare_and_write": false, 00:14:27.962 "abort": true, 00:14:27.962 "seek_hole": false, 00:14:27.962 "seek_data": false, 00:14:27.962 "copy": true, 00:14:27.962 "nvme_iov_md": false 00:14:27.962 }, 00:14:27.962 "memory_domains": [ 00:14:27.962 { 00:14:27.962 "dma_device_id": "system", 00:14:27.962 "dma_device_type": 1 00:14:27.962 }, 00:14:27.962 { 00:14:27.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.962 "dma_device_type": 2 00:14:27.962 } 00:14:27.962 ], 00:14:27.962 "driver_specific": {} 00:14:27.962 }' 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:27.962 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.220 23:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.220 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.220 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:28.220 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:28.220 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:28.220 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:28.220 "name": "BaseBdev2", 00:14:28.220 "aliases": [ 00:14:28.220 "4fb43d2b-24ce-45c1-b48b-aece9424a1a5" 00:14:28.220 ], 00:14:28.220 "product_name": "Malloc disk", 00:14:28.220 "block_size": 512, 00:14:28.220 "num_blocks": 65536, 00:14:28.220 "uuid": "4fb43d2b-24ce-45c1-b48b-aece9424a1a5", 00:14:28.220 
"assigned_rate_limits": { 00:14:28.220 "rw_ios_per_sec": 0, 00:14:28.220 "rw_mbytes_per_sec": 0, 00:14:28.220 "r_mbytes_per_sec": 0, 00:14:28.220 "w_mbytes_per_sec": 0 00:14:28.220 }, 00:14:28.220 "claimed": true, 00:14:28.220 "claim_type": "exclusive_write", 00:14:28.220 "zoned": false, 00:14:28.220 "supported_io_types": { 00:14:28.220 "read": true, 00:14:28.220 "write": true, 00:14:28.220 "unmap": true, 00:14:28.220 "flush": true, 00:14:28.220 "reset": true, 00:14:28.220 "nvme_admin": false, 00:14:28.220 "nvme_io": false, 00:14:28.220 "nvme_io_md": false, 00:14:28.220 "write_zeroes": true, 00:14:28.220 "zcopy": true, 00:14:28.220 "get_zone_info": false, 00:14:28.220 "zone_management": false, 00:14:28.220 "zone_append": false, 00:14:28.220 "compare": false, 00:14:28.220 "compare_and_write": false, 00:14:28.220 "abort": true, 00:14:28.220 "seek_hole": false, 00:14:28.220 "seek_data": false, 00:14:28.220 "copy": true, 00:14:28.220 "nvme_iov_md": false 00:14:28.220 }, 00:14:28.221 "memory_domains": [ 00:14:28.221 { 00:14:28.221 "dma_device_id": "system", 00:14:28.221 "dma_device_type": 1 00:14:28.221 }, 00:14:28.221 { 00:14:28.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.221 "dma_device_type": 2 00:14:28.221 } 00:14:28.221 ], 00:14:28.221 "driver_specific": {} 00:14:28.221 }' 00:14:28.221 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.479 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:28.737 "name": "BaseBdev3", 00:14:28.737 "aliases": [ 00:14:28.737 "53473a8b-5fb4-4151-ae68-65bf62fa4d95" 00:14:28.737 ], 00:14:28.737 "product_name": "Malloc disk", 00:14:28.737 "block_size": 512, 00:14:28.737 "num_blocks": 65536, 00:14:28.737 "uuid": "53473a8b-5fb4-4151-ae68-65bf62fa4d95", 00:14:28.737 "assigned_rate_limits": { 00:14:28.737 "rw_ios_per_sec": 0, 00:14:28.737 "rw_mbytes_per_sec": 0, 00:14:28.737 "r_mbytes_per_sec": 0, 00:14:28.737 "w_mbytes_per_sec": 0 00:14:28.737 }, 00:14:28.737 "claimed": true, 00:14:28.737 "claim_type": "exclusive_write", 00:14:28.737 "zoned": false, 00:14:28.737 "supported_io_types": { 00:14:28.737 "read": true, 00:14:28.737 "write": true, 00:14:28.737 "unmap": true, 00:14:28.737 "flush": true, 00:14:28.737 "reset": true, 00:14:28.737 "nvme_admin": false, 00:14:28.737 "nvme_io": false, 00:14:28.737 "nvme_io_md": false, 00:14:28.737 "write_zeroes": true, 00:14:28.737 "zcopy": 
true, 00:14:28.737 "get_zone_info": false, 00:14:28.737 "zone_management": false, 00:14:28.737 "zone_append": false, 00:14:28.737 "compare": false, 00:14:28.737 "compare_and_write": false, 00:14:28.737 "abort": true, 00:14:28.737 "seek_hole": false, 00:14:28.737 "seek_data": false, 00:14:28.737 "copy": true, 00:14:28.737 "nvme_iov_md": false 00:14:28.737 }, 00:14:28.737 "memory_domains": [ 00:14:28.737 { 00:14:28.737 "dma_device_id": "system", 00:14:28.737 "dma_device_type": 1 00:14:28.737 }, 00:14:28.737 { 00:14:28.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.737 "dma_device_type": 2 00:14:28.737 } 00:14:28.737 ], 00:14:28.737 "driver_specific": {} 00:14:28.737 }' 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.737 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:28.996 23:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 
-- # for name in $base_bdev_names 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:29.255 "name": "BaseBdev4", 00:14:29.255 "aliases": [ 00:14:29.255 "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc" 00:14:29.255 ], 00:14:29.255 "product_name": "Malloc disk", 00:14:29.255 "block_size": 512, 00:14:29.255 "num_blocks": 65536, 00:14:29.255 "uuid": "a2486a5c-7317-4cd0-b1bd-0afa4d8d66dc", 00:14:29.255 "assigned_rate_limits": { 00:14:29.255 "rw_ios_per_sec": 0, 00:14:29.255 "rw_mbytes_per_sec": 0, 00:14:29.255 "r_mbytes_per_sec": 0, 00:14:29.255 "w_mbytes_per_sec": 0 00:14:29.255 }, 00:14:29.255 "claimed": true, 00:14:29.255 "claim_type": "exclusive_write", 00:14:29.255 "zoned": false, 00:14:29.255 "supported_io_types": { 00:14:29.255 "read": true, 00:14:29.255 "write": true, 00:14:29.255 "unmap": true, 00:14:29.255 "flush": true, 00:14:29.255 "reset": true, 00:14:29.255 "nvme_admin": false, 00:14:29.255 "nvme_io": false, 00:14:29.255 "nvme_io_md": false, 00:14:29.255 "write_zeroes": true, 00:14:29.255 "zcopy": true, 00:14:29.255 "get_zone_info": false, 00:14:29.255 "zone_management": false, 00:14:29.255 "zone_append": false, 00:14:29.255 "compare": false, 00:14:29.255 "compare_and_write": false, 00:14:29.255 "abort": true, 00:14:29.255 "seek_hole": false, 00:14:29.255 "seek_data": false, 00:14:29.255 "copy": true, 00:14:29.255 "nvme_iov_md": false 00:14:29.255 }, 00:14:29.255 "memory_domains": [ 00:14:29.255 { 00:14:29.255 "dma_device_id": "system", 00:14:29.255 "dma_device_type": 1 00:14:29.255 }, 00:14:29.255 { 00:14:29.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.255 "dma_device_type": 2 00:14:29.255 } 
00:14:29.255 ], 00:14:29.255 "driver_specific": {} 00:14:29.255 }' 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.255 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.516 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:29.775 [2024-07-24 23:36:14.645884] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:29.775 [2024-07-24 23:36:14.645902] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:29.775 [2024-07-24 23:36:14.645935] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:29.775 [2024-07-24 23:36:14.645973] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:14:29.776 [2024-07-24 23:36:14.645979] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f57e0 name Existed_Raid, state offline 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 302070 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 302070 ']' 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 302070 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 302070 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 302070' 00:14:29.776 killing process with pid 302070 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 302070 00:14:29.776 [2024-07-24 23:36:14.704186] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:29.776 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 302070 00:14:29.776 [2024-07-24 23:36:14.735332] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:30.048 00:14:30.048 real 0m24.415s 00:14:30.048 user 0m45.437s 00:14:30.048 sys 0m3.769s 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:14:30.048 23:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.048 ************************************ 00:14:30.048 END TEST raid_state_function_test 00:14:30.048 ************************************ 00:14:30.048 23:36:14 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:14:30.048 23:36:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:30.048 23:36:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:30.048 23:36:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:30.048 ************************************ 00:14:30.048 START TEST raid_state_function_test_sb 00:14:30.048 ************************************ 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 true 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # 
echo BaseBdev2 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=307346 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 307346' 00:14:30.048 Process raid pid: 307346 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 307346 /var/tmp/spdk-raid.sock 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 307346 ']' 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:30.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:30.048 23:36:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:30.048 [2024-07-24 23:36:15.028355] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:14:30.048 [2024-07-24 23:36:15.028394] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:30.320 [2024-07-24 23:36:15.091645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:30.320 [2024-07-24 23:36:15.170181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:30.320 [2024-07-24 23:36:15.224016] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.320 [2024-07-24 23:36:15.224041] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.886 23:36:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:30.886 23:36:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:14:30.886 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:31.144 [2024-07-24 23:36:15.974868] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:31.144 [2024-07-24 23:36:15.974897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:31.144 [2024-07-24 23:36:15.974905] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:31.144 [2024-07-24 23:36:15.974910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:31.144 [2024-07-24 23:36:15.974914] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:31.144 [2024-07-24 23:36:15.974919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:31.144 [2024-07-24 23:36:15.974923] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:31.144 [2024-07-24 23:36:15.974928] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.144 23:36:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.402 23:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.402 "name": "Existed_Raid", 00:14:31.402 "uuid": "e64c73c7-9ddf-486f-bd3c-2fa03049bc21", 
00:14:31.402 "strip_size_kb": 64, 00:14:31.402 "state": "configuring", 00:14:31.402 "raid_level": "raid0", 00:14:31.402 "superblock": true, 00:14:31.402 "num_base_bdevs": 4, 00:14:31.402 "num_base_bdevs_discovered": 0, 00:14:31.402 "num_base_bdevs_operational": 4, 00:14:31.402 "base_bdevs_list": [ 00:14:31.402 { 00:14:31.402 "name": "BaseBdev1", 00:14:31.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.402 "is_configured": false, 00:14:31.402 "data_offset": 0, 00:14:31.402 "data_size": 0 00:14:31.402 }, 00:14:31.402 { 00:14:31.402 "name": "BaseBdev2", 00:14:31.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.402 "is_configured": false, 00:14:31.402 "data_offset": 0, 00:14:31.402 "data_size": 0 00:14:31.402 }, 00:14:31.402 { 00:14:31.402 "name": "BaseBdev3", 00:14:31.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.402 "is_configured": false, 00:14:31.402 "data_offset": 0, 00:14:31.402 "data_size": 0 00:14:31.402 }, 00:14:31.402 { 00:14:31.402 "name": "BaseBdev4", 00:14:31.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.402 "is_configured": false, 00:14:31.402 "data_offset": 0, 00:14:31.402 "data_size": 0 00:14:31.402 } 00:14:31.402 ] 00:14:31.402 }' 00:14:31.402 23:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.402 23:36:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.660 23:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:31.917 [2024-07-24 23:36:16.776864] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:31.917 [2024-07-24 23:36:16.776883] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e5b50 name Existed_Raid, state configuring 00:14:31.917 23:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:32.175 [2024-07-24 23:36:16.953352] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:32.175 [2024-07-24 23:36:16.953369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:32.175 [2024-07-24 23:36:16.953374] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:32.175 [2024-07-24 23:36:16.953379] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:32.175 [2024-07-24 23:36:16.953383] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:32.175 [2024-07-24 23:36:16.953388] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:32.175 [2024-07-24 23:36:16.953392] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:32.175 [2024-07-24 23:36:16.953397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:32.175 23:36:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:32.175 [2024-07-24 23:36:17.138532] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:32.175 BaseBdev1 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:32.175 23:36:17 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:32.175 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:32.433 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:32.691 [ 00:14:32.691 { 00:14:32.691 "name": "BaseBdev1", 00:14:32.691 "aliases": [ 00:14:32.691 "caba46c3-28ea-4cf7-8106-eb30101d9d3b" 00:14:32.691 ], 00:14:32.691 "product_name": "Malloc disk", 00:14:32.691 "block_size": 512, 00:14:32.691 "num_blocks": 65536, 00:14:32.691 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:32.691 "assigned_rate_limits": { 00:14:32.691 "rw_ios_per_sec": 0, 00:14:32.691 "rw_mbytes_per_sec": 0, 00:14:32.691 "r_mbytes_per_sec": 0, 00:14:32.691 "w_mbytes_per_sec": 0 00:14:32.691 }, 00:14:32.691 "claimed": true, 00:14:32.691 "claim_type": "exclusive_write", 00:14:32.691 "zoned": false, 00:14:32.691 "supported_io_types": { 00:14:32.691 "read": true, 00:14:32.691 "write": true, 00:14:32.691 "unmap": true, 00:14:32.691 "flush": true, 00:14:32.691 "reset": true, 00:14:32.691 "nvme_admin": false, 00:14:32.691 "nvme_io": false, 00:14:32.691 "nvme_io_md": false, 00:14:32.691 "write_zeroes": true, 00:14:32.691 "zcopy": true, 00:14:32.691 "get_zone_info": false, 00:14:32.691 "zone_management": false, 00:14:32.691 "zone_append": false, 00:14:32.691 "compare": false, 00:14:32.691 "compare_and_write": false, 00:14:32.691 "abort": true, 00:14:32.691 "seek_hole": false, 00:14:32.691 "seek_data": false, 
00:14:32.691 "copy": true, 00:14:32.691 "nvme_iov_md": false 00:14:32.691 }, 00:14:32.691 "memory_domains": [ 00:14:32.691 { 00:14:32.691 "dma_device_id": "system", 00:14:32.691 "dma_device_type": 1 00:14:32.691 }, 00:14:32.691 { 00:14:32.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.691 "dma_device_type": 2 00:14:32.691 } 00:14:32.691 ], 00:14:32.691 "driver_specific": {} 00:14:32.691 } 00:14:32.691 ] 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:32.691 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.692 "name": "Existed_Raid", 00:14:32.692 "uuid": "fd25261a-b96d-4bc1-b952-7fea9478c987", 00:14:32.692 "strip_size_kb": 64, 00:14:32.692 "state": "configuring", 00:14:32.692 "raid_level": "raid0", 00:14:32.692 "superblock": true, 00:14:32.692 "num_base_bdevs": 4, 00:14:32.692 "num_base_bdevs_discovered": 1, 00:14:32.692 "num_base_bdevs_operational": 4, 00:14:32.692 "base_bdevs_list": [ 00:14:32.692 { 00:14:32.692 "name": "BaseBdev1", 00:14:32.692 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:32.692 "is_configured": true, 00:14:32.692 "data_offset": 2048, 00:14:32.692 "data_size": 63488 00:14:32.692 }, 00:14:32.692 { 00:14:32.692 "name": "BaseBdev2", 00:14:32.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.692 "is_configured": false, 00:14:32.692 "data_offset": 0, 00:14:32.692 "data_size": 0 00:14:32.692 }, 00:14:32.692 { 00:14:32.692 "name": "BaseBdev3", 00:14:32.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.692 "is_configured": false, 00:14:32.692 "data_offset": 0, 00:14:32.692 "data_size": 0 00:14:32.692 }, 00:14:32.692 { 00:14:32.692 "name": "BaseBdev4", 00:14:32.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:32.692 "is_configured": false, 00:14:32.692 "data_offset": 0, 00:14:32.692 "data_size": 0 00:14:32.692 } 00:14:32.692 ] 00:14:32.692 }' 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.692 23:36:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.258 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:33.515 [2024-07-24 23:36:18.309563] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:14:33.515 [2024-07-24 23:36:18.309593] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e53a0 name Existed_Raid, state configuring 00:14:33.515 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:33.515 [2024-07-24 23:36:18.461988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:33.516 [2024-07-24 23:36:18.462996] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:33.516 [2024-07-24 23:36:18.463020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:33.516 [2024-07-24 23:36:18.463025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:33.516 [2024-07-24 23:36:18.463030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:33.516 [2024-07-24 23:36:18.463035] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:33.516 [2024-07-24 23:36:18.463039] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:33.516 23:36:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.516 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:33.774 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.774 "name": "Existed_Raid", 00:14:33.774 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:33.774 "strip_size_kb": 64, 00:14:33.774 "state": "configuring", 00:14:33.774 "raid_level": "raid0", 00:14:33.774 "superblock": true, 00:14:33.774 "num_base_bdevs": 4, 00:14:33.774 "num_base_bdevs_discovered": 1, 00:14:33.774 "num_base_bdevs_operational": 4, 00:14:33.774 "base_bdevs_list": [ 00:14:33.774 { 00:14:33.774 "name": "BaseBdev1", 00:14:33.774 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:33.774 "is_configured": true, 00:14:33.774 "data_offset": 2048, 00:14:33.774 "data_size": 63488 00:14:33.774 }, 00:14:33.774 { 00:14:33.774 "name": "BaseBdev2", 00:14:33.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.774 "is_configured": false, 
00:14:33.774 "data_offset": 0, 00:14:33.774 "data_size": 0 00:14:33.774 }, 00:14:33.774 { 00:14:33.774 "name": "BaseBdev3", 00:14:33.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.774 "is_configured": false, 00:14:33.774 "data_offset": 0, 00:14:33.774 "data_size": 0 00:14:33.774 }, 00:14:33.774 { 00:14:33.774 "name": "BaseBdev4", 00:14:33.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.774 "is_configured": false, 00:14:33.774 "data_offset": 0, 00:14:33.774 "data_size": 0 00:14:33.774 } 00:14:33.774 ] 00:14:33.774 }' 00:14:33.774 23:36:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.774 23:36:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:34.340 [2024-07-24 23:36:19.258641] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:34.340 BaseBdev2 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:34.340 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:34.599 [ 00:14:34.599 { 00:14:34.599 "name": "BaseBdev2", 00:14:34.599 "aliases": [ 00:14:34.599 "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a" 00:14:34.599 ], 00:14:34.599 "product_name": "Malloc disk", 00:14:34.599 "block_size": 512, 00:14:34.599 "num_blocks": 65536, 00:14:34.599 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:34.599 "assigned_rate_limits": { 00:14:34.599 "rw_ios_per_sec": 0, 00:14:34.599 "rw_mbytes_per_sec": 0, 00:14:34.599 "r_mbytes_per_sec": 0, 00:14:34.599 "w_mbytes_per_sec": 0 00:14:34.599 }, 00:14:34.599 "claimed": true, 00:14:34.599 "claim_type": "exclusive_write", 00:14:34.599 "zoned": false, 00:14:34.599 "supported_io_types": { 00:14:34.599 "read": true, 00:14:34.599 "write": true, 00:14:34.599 "unmap": true, 00:14:34.599 "flush": true, 00:14:34.599 "reset": true, 00:14:34.599 "nvme_admin": false, 00:14:34.599 "nvme_io": false, 00:14:34.599 "nvme_io_md": false, 00:14:34.599 "write_zeroes": true, 00:14:34.599 "zcopy": true, 00:14:34.599 "get_zone_info": false, 00:14:34.599 "zone_management": false, 00:14:34.599 "zone_append": false, 00:14:34.599 "compare": false, 00:14:34.599 "compare_and_write": false, 00:14:34.599 "abort": true, 00:14:34.599 "seek_hole": false, 00:14:34.599 "seek_data": false, 00:14:34.599 "copy": true, 00:14:34.599 "nvme_iov_md": false 00:14:34.599 }, 00:14:34.599 "memory_domains": [ 00:14:34.599 { 00:14:34.599 "dma_device_id": "system", 00:14:34.599 "dma_device_type": 1 00:14:34.599 }, 00:14:34.599 { 00:14:34.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.599 "dma_device_type": 2 00:14:34.599 } 00:14:34.599 ], 00:14:34.599 "driver_specific": {} 00:14:34.599 } 00:14:34.599 ] 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.599 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.857 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.857 "name": "Existed_Raid", 00:14:34.857 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:34.857 "strip_size_kb": 64, 
00:14:34.857 "state": "configuring", 00:14:34.857 "raid_level": "raid0", 00:14:34.857 "superblock": true, 00:14:34.857 "num_base_bdevs": 4, 00:14:34.857 "num_base_bdevs_discovered": 2, 00:14:34.857 "num_base_bdevs_operational": 4, 00:14:34.857 "base_bdevs_list": [ 00:14:34.857 { 00:14:34.857 "name": "BaseBdev1", 00:14:34.857 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:34.857 "is_configured": true, 00:14:34.857 "data_offset": 2048, 00:14:34.857 "data_size": 63488 00:14:34.857 }, 00:14:34.857 { 00:14:34.857 "name": "BaseBdev2", 00:14:34.857 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:34.857 "is_configured": true, 00:14:34.857 "data_offset": 2048, 00:14:34.857 "data_size": 63488 00:14:34.857 }, 00:14:34.857 { 00:14:34.857 "name": "BaseBdev3", 00:14:34.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.857 "is_configured": false, 00:14:34.857 "data_offset": 0, 00:14:34.857 "data_size": 0 00:14:34.857 }, 00:14:34.857 { 00:14:34.857 "name": "BaseBdev4", 00:14:34.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.857 "is_configured": false, 00:14:34.857 "data_offset": 0, 00:14:34.857 "data_size": 0 00:14:34.857 } 00:14:34.857 ] 00:14:34.857 }' 00:14:34.857 23:36:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.857 23:36:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:35.423 [2024-07-24 23:36:20.392217] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:35.423 BaseBdev3 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:35.423 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:35.681 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:35.939 [ 00:14:35.939 { 00:14:35.939 "name": "BaseBdev3", 00:14:35.939 "aliases": [ 00:14:35.939 "e76291c4-62ef-4826-89c7-cd2f9c12cf48" 00:14:35.939 ], 00:14:35.939 "product_name": "Malloc disk", 00:14:35.939 "block_size": 512, 00:14:35.939 "num_blocks": 65536, 00:14:35.939 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:35.939 "assigned_rate_limits": { 00:14:35.939 "rw_ios_per_sec": 0, 00:14:35.939 "rw_mbytes_per_sec": 0, 00:14:35.939 "r_mbytes_per_sec": 0, 00:14:35.939 "w_mbytes_per_sec": 0 00:14:35.939 }, 00:14:35.939 "claimed": true, 00:14:35.939 "claim_type": "exclusive_write", 00:14:35.939 "zoned": false, 00:14:35.939 "supported_io_types": { 00:14:35.939 "read": true, 00:14:35.939 "write": true, 00:14:35.939 "unmap": true, 00:14:35.939 "flush": true, 00:14:35.939 "reset": true, 00:14:35.939 "nvme_admin": false, 00:14:35.939 "nvme_io": false, 00:14:35.939 "nvme_io_md": false, 00:14:35.939 "write_zeroes": true, 00:14:35.939 "zcopy": true, 00:14:35.939 "get_zone_info": false, 00:14:35.939 "zone_management": false, 00:14:35.939 "zone_append": false, 00:14:35.939 
"compare": false, 00:14:35.939 "compare_and_write": false, 00:14:35.939 "abort": true, 00:14:35.939 "seek_hole": false, 00:14:35.939 "seek_data": false, 00:14:35.940 "copy": true, 00:14:35.940 "nvme_iov_md": false 00:14:35.940 }, 00:14:35.940 "memory_domains": [ 00:14:35.940 { 00:14:35.940 "dma_device_id": "system", 00:14:35.940 "dma_device_type": 1 00:14:35.940 }, 00:14:35.940 { 00:14:35.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.940 "dma_device_type": 2 00:14:35.940 } 00:14:35.940 ], 00:14:35.940 "driver_specific": {} 00:14:35.940 } 00:14:35.940 ] 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.940 23:36:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.940 "name": "Existed_Raid", 00:14:35.940 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:35.940 "strip_size_kb": 64, 00:14:35.940 "state": "configuring", 00:14:35.940 "raid_level": "raid0", 00:14:35.940 "superblock": true, 00:14:35.940 "num_base_bdevs": 4, 00:14:35.940 "num_base_bdevs_discovered": 3, 00:14:35.940 "num_base_bdevs_operational": 4, 00:14:35.940 "base_bdevs_list": [ 00:14:35.940 { 00:14:35.940 "name": "BaseBdev1", 00:14:35.940 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:35.940 "is_configured": true, 00:14:35.940 "data_offset": 2048, 00:14:35.940 "data_size": 63488 00:14:35.940 }, 00:14:35.940 { 00:14:35.940 "name": "BaseBdev2", 00:14:35.940 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:35.940 "is_configured": true, 00:14:35.940 "data_offset": 2048, 00:14:35.940 "data_size": 63488 00:14:35.940 }, 00:14:35.940 { 00:14:35.940 "name": "BaseBdev3", 00:14:35.940 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:35.940 "is_configured": true, 00:14:35.940 "data_offset": 2048, 00:14:35.940 "data_size": 63488 00:14:35.940 }, 00:14:35.940 { 00:14:35.940 "name": "BaseBdev4", 00:14:35.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.940 "is_configured": false, 00:14:35.940 "data_offset": 0, 00:14:35.940 "data_size": 0 00:14:35.940 } 00:14:35.940 ] 00:14:35.940 }' 00:14:35.940 23:36:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.940 23:36:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:36.506 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:36.764 [2024-07-24 23:36:21.565953] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:36.764 [2024-07-24 23:36:21.566094] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16e63d0 00:14:36.764 [2024-07-24 23:36:21.566104] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:36.764 [2024-07-24 23:36:21.566225] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e60a0 00:14:36.764 [2024-07-24 23:36:21.566311] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16e63d0 00:14:36.764 [2024-07-24 23:36:21.566317] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16e63d0 00:14:36.764 [2024-07-24 23:36:21.566378] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:36.764 BaseBdev4 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.764 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:37.022 [ 00:14:37.022 { 00:14:37.022 "name": "BaseBdev4", 00:14:37.022 "aliases": [ 00:14:37.022 "ea3d023a-3db6-45fd-b0e8-f1399fea9359" 00:14:37.022 ], 00:14:37.022 "product_name": "Malloc disk", 00:14:37.022 "block_size": 512, 00:14:37.022 "num_blocks": 65536, 00:14:37.022 "uuid": "ea3d023a-3db6-45fd-b0e8-f1399fea9359", 00:14:37.022 "assigned_rate_limits": { 00:14:37.022 "rw_ios_per_sec": 0, 00:14:37.022 "rw_mbytes_per_sec": 0, 00:14:37.022 "r_mbytes_per_sec": 0, 00:14:37.022 "w_mbytes_per_sec": 0 00:14:37.022 }, 00:14:37.022 "claimed": true, 00:14:37.022 "claim_type": "exclusive_write", 00:14:37.022 "zoned": false, 00:14:37.022 "supported_io_types": { 00:14:37.022 "read": true, 00:14:37.022 "write": true, 00:14:37.022 "unmap": true, 00:14:37.022 "flush": true, 00:14:37.022 "reset": true, 00:14:37.022 "nvme_admin": false, 00:14:37.022 "nvme_io": false, 00:14:37.022 "nvme_io_md": false, 00:14:37.022 "write_zeroes": true, 00:14:37.022 "zcopy": true, 00:14:37.022 "get_zone_info": false, 00:14:37.022 "zone_management": false, 00:14:37.022 "zone_append": false, 00:14:37.022 "compare": false, 00:14:37.022 "compare_and_write": false, 00:14:37.022 "abort": true, 00:14:37.022 "seek_hole": false, 00:14:37.022 "seek_data": false, 00:14:37.022 "copy": true, 00:14:37.022 "nvme_iov_md": false 00:14:37.022 }, 00:14:37.022 "memory_domains": [ 00:14:37.022 { 00:14:37.022 "dma_device_id": "system", 00:14:37.022 "dma_device_type": 1 00:14:37.022 }, 00:14:37.022 { 00:14:37.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.022 "dma_device_type": 2 00:14:37.022 } 00:14:37.022 ], 00:14:37.022 "driver_specific": {} 00:14:37.022 } 00:14:37.022 ] 
00:14:37.022 23:36:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:37.022 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.023 23:36:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.281 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.281 "name": "Existed_Raid", 00:14:37.281 
"uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:37.281 "strip_size_kb": 64, 00:14:37.281 "state": "online", 00:14:37.281 "raid_level": "raid0", 00:14:37.281 "superblock": true, 00:14:37.281 "num_base_bdevs": 4, 00:14:37.281 "num_base_bdevs_discovered": 4, 00:14:37.281 "num_base_bdevs_operational": 4, 00:14:37.281 "base_bdevs_list": [ 00:14:37.281 { 00:14:37.281 "name": "BaseBdev1", 00:14:37.281 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:37.281 "is_configured": true, 00:14:37.281 "data_offset": 2048, 00:14:37.281 "data_size": 63488 00:14:37.281 }, 00:14:37.281 { 00:14:37.281 "name": "BaseBdev2", 00:14:37.281 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:37.281 "is_configured": true, 00:14:37.281 "data_offset": 2048, 00:14:37.281 "data_size": 63488 00:14:37.281 }, 00:14:37.281 { 00:14:37.281 "name": "BaseBdev3", 00:14:37.281 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:37.281 "is_configured": true, 00:14:37.281 "data_offset": 2048, 00:14:37.281 "data_size": 63488 00:14:37.281 }, 00:14:37.281 { 00:14:37.281 "name": "BaseBdev4", 00:14:37.281 "uuid": "ea3d023a-3db6-45fd-b0e8-f1399fea9359", 00:14:37.281 "is_configured": true, 00:14:37.281 "data_offset": 2048, 00:14:37.281 "data_size": 63488 00:14:37.281 } 00:14:37.281 ] 00:14:37.281 }' 00:14:37.281 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.281 23:36:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:37.847 23:36:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:37.847 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:37.848 [2024-07-24 23:36:22.713118] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:37.848 "name": "Existed_Raid", 00:14:37.848 "aliases": [ 00:14:37.848 "627e6ede-1b18-46c7-a210-4b21e8acd7ac" 00:14:37.848 ], 00:14:37.848 "product_name": "Raid Volume", 00:14:37.848 "block_size": 512, 00:14:37.848 "num_blocks": 253952, 00:14:37.848 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:37.848 "assigned_rate_limits": { 00:14:37.848 "rw_ios_per_sec": 0, 00:14:37.848 "rw_mbytes_per_sec": 0, 00:14:37.848 "r_mbytes_per_sec": 0, 00:14:37.848 "w_mbytes_per_sec": 0 00:14:37.848 }, 00:14:37.848 "claimed": false, 00:14:37.848 "zoned": false, 00:14:37.848 "supported_io_types": { 00:14:37.848 "read": true, 00:14:37.848 "write": true, 00:14:37.848 "unmap": true, 00:14:37.848 "flush": true, 00:14:37.848 "reset": true, 00:14:37.848 "nvme_admin": false, 00:14:37.848 "nvme_io": false, 00:14:37.848 "nvme_io_md": false, 00:14:37.848 "write_zeroes": true, 00:14:37.848 "zcopy": false, 00:14:37.848 "get_zone_info": false, 00:14:37.848 "zone_management": false, 00:14:37.848 "zone_append": false, 00:14:37.848 "compare": false, 00:14:37.848 "compare_and_write": false, 00:14:37.848 "abort": false, 00:14:37.848 "seek_hole": false, 00:14:37.848 "seek_data": false, 00:14:37.848 "copy": false, 00:14:37.848 "nvme_iov_md": false 00:14:37.848 }, 00:14:37.848 
"memory_domains": [ 00:14:37.848 { 00:14:37.848 "dma_device_id": "system", 00:14:37.848 "dma_device_type": 1 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.848 "dma_device_type": 2 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "system", 00:14:37.848 "dma_device_type": 1 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.848 "dma_device_type": 2 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "system", 00:14:37.848 "dma_device_type": 1 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.848 "dma_device_type": 2 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "system", 00:14:37.848 "dma_device_type": 1 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.848 "dma_device_type": 2 00:14:37.848 } 00:14:37.848 ], 00:14:37.848 "driver_specific": { 00:14:37.848 "raid": { 00:14:37.848 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:37.848 "strip_size_kb": 64, 00:14:37.848 "state": "online", 00:14:37.848 "raid_level": "raid0", 00:14:37.848 "superblock": true, 00:14:37.848 "num_base_bdevs": 4, 00:14:37.848 "num_base_bdevs_discovered": 4, 00:14:37.848 "num_base_bdevs_operational": 4, 00:14:37.848 "base_bdevs_list": [ 00:14:37.848 { 00:14:37.848 "name": "BaseBdev1", 00:14:37.848 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:37.848 "is_configured": true, 00:14:37.848 "data_offset": 2048, 00:14:37.848 "data_size": 63488 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "name": "BaseBdev2", 00:14:37.848 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:37.848 "is_configured": true, 00:14:37.848 "data_offset": 2048, 00:14:37.848 "data_size": 63488 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "name": "BaseBdev3", 00:14:37.848 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:37.848 "is_configured": true, 00:14:37.848 "data_offset": 2048, 00:14:37.848 
"data_size": 63488 00:14:37.848 }, 00:14:37.848 { 00:14:37.848 "name": "BaseBdev4", 00:14:37.848 "uuid": "ea3d023a-3db6-45fd-b0e8-f1399fea9359", 00:14:37.848 "is_configured": true, 00:14:37.848 "data_offset": 2048, 00:14:37.848 "data_size": 63488 00:14:37.848 } 00:14:37.848 ] 00:14:37.848 } 00:14:37.848 } 00:14:37.848 }' 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:37.848 BaseBdev2 00:14:37.848 BaseBdev3 00:14:37.848 BaseBdev4' 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:37.848 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:38.106 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.106 "name": "BaseBdev1", 00:14:38.106 "aliases": [ 00:14:38.106 "caba46c3-28ea-4cf7-8106-eb30101d9d3b" 00:14:38.106 ], 00:14:38.106 "product_name": "Malloc disk", 00:14:38.106 "block_size": 512, 00:14:38.106 "num_blocks": 65536, 00:14:38.106 "uuid": "caba46c3-28ea-4cf7-8106-eb30101d9d3b", 00:14:38.106 "assigned_rate_limits": { 00:14:38.106 "rw_ios_per_sec": 0, 00:14:38.106 "rw_mbytes_per_sec": 0, 00:14:38.106 "r_mbytes_per_sec": 0, 00:14:38.106 "w_mbytes_per_sec": 0 00:14:38.106 }, 00:14:38.106 "claimed": true, 00:14:38.106 "claim_type": "exclusive_write", 00:14:38.106 "zoned": false, 00:14:38.106 "supported_io_types": { 00:14:38.106 "read": true, 00:14:38.106 "write": true, 00:14:38.106 "unmap": true, 00:14:38.106 "flush": true, 00:14:38.106 "reset": true, 
00:14:38.106 "nvme_admin": false, 00:14:38.106 "nvme_io": false, 00:14:38.106 "nvme_io_md": false, 00:14:38.106 "write_zeroes": true, 00:14:38.106 "zcopy": true, 00:14:38.106 "get_zone_info": false, 00:14:38.106 "zone_management": false, 00:14:38.106 "zone_append": false, 00:14:38.106 "compare": false, 00:14:38.106 "compare_and_write": false, 00:14:38.106 "abort": true, 00:14:38.106 "seek_hole": false, 00:14:38.106 "seek_data": false, 00:14:38.106 "copy": true, 00:14:38.106 "nvme_iov_md": false 00:14:38.106 }, 00:14:38.106 "memory_domains": [ 00:14:38.106 { 00:14:38.106 "dma_device_id": "system", 00:14:38.106 "dma_device_type": 1 00:14:38.106 }, 00:14:38.106 { 00:14:38.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.106 "dma_device_type": 2 00:14:38.106 } 00:14:38.106 ], 00:14:38.106 "driver_specific": {} 00:14:38.106 }' 00:14:38.106 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.106 23:36:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.106 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.106 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.106 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.106 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.106 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:38.364 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:38.621 "name": "BaseBdev2", 00:14:38.621 "aliases": [ 00:14:38.621 "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a" 00:14:38.621 ], 00:14:38.621 "product_name": "Malloc disk", 00:14:38.621 "block_size": 512, 00:14:38.621 "num_blocks": 65536, 00:14:38.621 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:38.621 "assigned_rate_limits": { 00:14:38.621 "rw_ios_per_sec": 0, 00:14:38.621 "rw_mbytes_per_sec": 0, 00:14:38.621 "r_mbytes_per_sec": 0, 00:14:38.621 "w_mbytes_per_sec": 0 00:14:38.621 }, 00:14:38.621 "claimed": true, 00:14:38.621 "claim_type": "exclusive_write", 00:14:38.621 "zoned": false, 00:14:38.621 "supported_io_types": { 00:14:38.621 "read": true, 00:14:38.621 "write": true, 00:14:38.621 "unmap": true, 00:14:38.621 "flush": true, 00:14:38.621 "reset": true, 00:14:38.621 "nvme_admin": false, 00:14:38.621 "nvme_io": false, 00:14:38.621 "nvme_io_md": false, 00:14:38.621 "write_zeroes": true, 00:14:38.621 "zcopy": true, 00:14:38.621 "get_zone_info": false, 00:14:38.621 "zone_management": false, 00:14:38.621 "zone_append": false, 00:14:38.621 "compare": false, 00:14:38.621 "compare_and_write": false, 00:14:38.621 "abort": true, 00:14:38.621 "seek_hole": false, 00:14:38.621 "seek_data": false, 00:14:38.621 "copy": true, 00:14:38.621 "nvme_iov_md": false 00:14:38.621 }, 00:14:38.621 "memory_domains": [ 00:14:38.621 { 
00:14:38.621 "dma_device_id": "system", 00:14:38.621 "dma_device_type": 1 00:14:38.621 }, 00:14:38.621 { 00:14:38.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.621 "dma_device_type": 2 00:14:38.621 } 00:14:38.621 ], 00:14:38.621 "driver_specific": {} 00:14:38.621 }' 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.621 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:38.878 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:39.136 23:36:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:39.136 "name": "BaseBdev3", 00:14:39.136 "aliases": [ 00:14:39.136 "e76291c4-62ef-4826-89c7-cd2f9c12cf48" 00:14:39.136 ], 00:14:39.136 "product_name": "Malloc disk", 00:14:39.136 "block_size": 512, 00:14:39.136 "num_blocks": 65536, 00:14:39.136 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:39.136 "assigned_rate_limits": { 00:14:39.136 "rw_ios_per_sec": 0, 00:14:39.136 "rw_mbytes_per_sec": 0, 00:14:39.136 "r_mbytes_per_sec": 0, 00:14:39.136 "w_mbytes_per_sec": 0 00:14:39.136 }, 00:14:39.136 "claimed": true, 00:14:39.136 "claim_type": "exclusive_write", 00:14:39.136 "zoned": false, 00:14:39.136 "supported_io_types": { 00:14:39.136 "read": true, 00:14:39.136 "write": true, 00:14:39.136 "unmap": true, 00:14:39.136 "flush": true, 00:14:39.136 "reset": true, 00:14:39.136 "nvme_admin": false, 00:14:39.136 "nvme_io": false, 00:14:39.136 "nvme_io_md": false, 00:14:39.136 "write_zeroes": true, 00:14:39.136 "zcopy": true, 00:14:39.136 "get_zone_info": false, 00:14:39.136 "zone_management": false, 00:14:39.136 "zone_append": false, 00:14:39.136 "compare": false, 00:14:39.136 "compare_and_write": false, 00:14:39.136 "abort": true, 00:14:39.136 "seek_hole": false, 00:14:39.136 "seek_data": false, 00:14:39.136 "copy": true, 00:14:39.136 "nvme_iov_md": false 00:14:39.136 }, 00:14:39.136 "memory_domains": [ 00:14:39.136 { 00:14:39.136 "dma_device_id": "system", 00:14:39.136 "dma_device_type": 1 00:14:39.136 }, 00:14:39.136 { 00:14:39.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.136 "dma_device_type": 2 00:14:39.136 } 00:14:39.136 ], 00:14:39.136 "driver_specific": {} 00:14:39.136 }' 00:14:39.136 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.136 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.136 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:14:39.136 23:36:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.136 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.136 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:39.136 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.136 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.393 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:39.393 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.393 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.393 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:39.394 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:39.394 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:39.394 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:39.394 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:39.394 "name": "BaseBdev4", 00:14:39.394 "aliases": [ 00:14:39.394 "ea3d023a-3db6-45fd-b0e8-f1399fea9359" 00:14:39.394 ], 00:14:39.394 "product_name": "Malloc disk", 00:14:39.394 "block_size": 512, 00:14:39.394 "num_blocks": 65536, 00:14:39.394 "uuid": "ea3d023a-3db6-45fd-b0e8-f1399fea9359", 00:14:39.394 "assigned_rate_limits": { 00:14:39.394 "rw_ios_per_sec": 0, 00:14:39.394 "rw_mbytes_per_sec": 0, 00:14:39.394 "r_mbytes_per_sec": 0, 00:14:39.394 "w_mbytes_per_sec": 0 
00:14:39.394 }, 00:14:39.394 "claimed": true, 00:14:39.394 "claim_type": "exclusive_write", 00:14:39.394 "zoned": false, 00:14:39.394 "supported_io_types": { 00:14:39.394 "read": true, 00:14:39.394 "write": true, 00:14:39.394 "unmap": true, 00:14:39.394 "flush": true, 00:14:39.394 "reset": true, 00:14:39.394 "nvme_admin": false, 00:14:39.394 "nvme_io": false, 00:14:39.394 "nvme_io_md": false, 00:14:39.394 "write_zeroes": true, 00:14:39.394 "zcopy": true, 00:14:39.394 "get_zone_info": false, 00:14:39.394 "zone_management": false, 00:14:39.394 "zone_append": false, 00:14:39.394 "compare": false, 00:14:39.394 "compare_and_write": false, 00:14:39.394 "abort": true, 00:14:39.394 "seek_hole": false, 00:14:39.394 "seek_data": false, 00:14:39.394 "copy": true, 00:14:39.394 "nvme_iov_md": false 00:14:39.394 }, 00:14:39.394 "memory_domains": [ 00:14:39.394 { 00:14:39.394 "dma_device_id": "system", 00:14:39.394 "dma_device_type": 1 00:14:39.394 }, 00:14:39.394 { 00:14:39.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.394 "dma_device_type": 2 00:14:39.394 } 00:14:39.394 ], 00:14:39.394 "driver_specific": {} 00:14:39.394 }' 00:14:39.394 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:39.651 
23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:39.651 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:39.909 [2024-07-24 23:36:24.850482] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:39.909 [2024-07-24 23:36:24.850501] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:39.909 [2024-07-24 23:36:24.850530] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.909 23:36:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.167 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.167 "name": "Existed_Raid", 00:14:40.167 "uuid": "627e6ede-1b18-46c7-a210-4b21e8acd7ac", 00:14:40.167 "strip_size_kb": 64, 00:14:40.167 "state": "offline", 00:14:40.167 "raid_level": "raid0", 00:14:40.167 "superblock": true, 00:14:40.167 "num_base_bdevs": 4, 00:14:40.167 "num_base_bdevs_discovered": 3, 00:14:40.167 "num_base_bdevs_operational": 3, 00:14:40.167 "base_bdevs_list": [ 00:14:40.167 { 00:14:40.167 "name": null, 00:14:40.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.167 "is_configured": false, 00:14:40.167 "data_offset": 2048, 00:14:40.167 "data_size": 63488 00:14:40.167 }, 00:14:40.167 { 00:14:40.167 "name": "BaseBdev2", 00:14:40.167 "uuid": "cb52d8aa-f2a8-4e6d-b620-cdb78d75892a", 00:14:40.167 "is_configured": true, 00:14:40.167 "data_offset": 2048, 00:14:40.167 "data_size": 63488 00:14:40.167 }, 00:14:40.167 
{ 00:14:40.167 "name": "BaseBdev3", 00:14:40.167 "uuid": "e76291c4-62ef-4826-89c7-cd2f9c12cf48", 00:14:40.167 "is_configured": true, 00:14:40.167 "data_offset": 2048, 00:14:40.167 "data_size": 63488 00:14:40.167 }, 00:14:40.167 { 00:14:40.167 "name": "BaseBdev4", 00:14:40.167 "uuid": "ea3d023a-3db6-45fd-b0e8-f1399fea9359", 00:14:40.167 "is_configured": true, 00:14:40.167 "data_offset": 2048, 00:14:40.167 "data_size": 63488 00:14:40.167 } 00:14:40.167 ] 00:14:40.167 }' 00:14:40.167 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.167 23:36:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:40.731 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:40.989 [2024-07-24 23:36:25.817747] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:40.989 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:40.989 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:40.989 
23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.989 23:36:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:41.247 [2024-07-24 23:36:26.160191] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.247 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:41.505 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:41.505 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:41.505 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:41.762 [2024-07-24 23:36:26.510911] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:41.762 [2024-07-24 23:36:26.510939] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e63d0 name Existed_Raid, state offline 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:41.762 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:42.020 BaseBdev2 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 
00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:42.020 23:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.277 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:42.277 [ 00:14:42.277 { 00:14:42.277 "name": "BaseBdev2", 00:14:42.277 "aliases": [ 00:14:42.277 "ff84529b-7b19-4a38-9428-6212ebbcb076" 00:14:42.277 ], 00:14:42.277 "product_name": "Malloc disk", 00:14:42.277 "block_size": 512, 00:14:42.277 "num_blocks": 65536, 00:14:42.277 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:42.277 "assigned_rate_limits": { 00:14:42.277 "rw_ios_per_sec": 0, 00:14:42.277 "rw_mbytes_per_sec": 0, 00:14:42.277 "r_mbytes_per_sec": 0, 00:14:42.277 "w_mbytes_per_sec": 0 00:14:42.277 }, 00:14:42.277 "claimed": false, 00:14:42.277 "zoned": false, 00:14:42.277 "supported_io_types": { 00:14:42.277 "read": true, 00:14:42.277 "write": true, 00:14:42.277 "unmap": true, 00:14:42.277 "flush": true, 00:14:42.277 "reset": true, 00:14:42.277 "nvme_admin": false, 00:14:42.277 "nvme_io": false, 00:14:42.277 "nvme_io_md": false, 00:14:42.277 "write_zeroes": true, 00:14:42.277 "zcopy": true, 00:14:42.277 "get_zone_info": false, 00:14:42.277 "zone_management": false, 00:14:42.277 "zone_append": false, 00:14:42.277 "compare": false, 00:14:42.277 "compare_and_write": false, 00:14:42.277 "abort": true, 00:14:42.277 "seek_hole": false, 00:14:42.277 "seek_data": false, 00:14:42.277 "copy": true, 00:14:42.277 "nvme_iov_md": false 00:14:42.277 }, 00:14:42.277 "memory_domains": [ 00:14:42.277 { 00:14:42.277 "dma_device_id": "system", 00:14:42.277 "dma_device_type": 1 00:14:42.277 }, 00:14:42.277 { 00:14:42.277 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.277 "dma_device_type": 2 00:14:42.277 } 00:14:42.277 ], 00:14:42.277 "driver_specific": {} 00:14:42.277 } 00:14:42.277 ] 00:14:42.277 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:42.277 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:42.277 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:42.277 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:42.535 BaseBdev3 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.535 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:42.793 [ 00:14:42.793 { 00:14:42.793 "name": "BaseBdev3", 00:14:42.793 "aliases": [ 00:14:42.793 "6614d98d-428b-4e2a-a0eb-88bb9519415c" 
00:14:42.793 ], 00:14:42.793 "product_name": "Malloc disk", 00:14:42.793 "block_size": 512, 00:14:42.793 "num_blocks": 65536, 00:14:42.793 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:42.793 "assigned_rate_limits": { 00:14:42.793 "rw_ios_per_sec": 0, 00:14:42.793 "rw_mbytes_per_sec": 0, 00:14:42.793 "r_mbytes_per_sec": 0, 00:14:42.793 "w_mbytes_per_sec": 0 00:14:42.793 }, 00:14:42.793 "claimed": false, 00:14:42.793 "zoned": false, 00:14:42.793 "supported_io_types": { 00:14:42.793 "read": true, 00:14:42.793 "write": true, 00:14:42.793 "unmap": true, 00:14:42.793 "flush": true, 00:14:42.793 "reset": true, 00:14:42.793 "nvme_admin": false, 00:14:42.793 "nvme_io": false, 00:14:42.793 "nvme_io_md": false, 00:14:42.793 "write_zeroes": true, 00:14:42.793 "zcopy": true, 00:14:42.793 "get_zone_info": false, 00:14:42.793 "zone_management": false, 00:14:42.793 "zone_append": false, 00:14:42.793 "compare": false, 00:14:42.793 "compare_and_write": false, 00:14:42.793 "abort": true, 00:14:42.793 "seek_hole": false, 00:14:42.793 "seek_data": false, 00:14:42.793 "copy": true, 00:14:42.793 "nvme_iov_md": false 00:14:42.793 }, 00:14:42.793 "memory_domains": [ 00:14:42.793 { 00:14:42.793 "dma_device_id": "system", 00:14:42.793 "dma_device_type": 1 00:14:42.793 }, 00:14:42.793 { 00:14:42.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.793 "dma_device_type": 2 00:14:42.793 } 00:14:42.793 ], 00:14:42.793 "driver_specific": {} 00:14:42.793 } 00:14:42.793 ] 00:14:42.793 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:42.793 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:42.793 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:42.793 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:14:43.051 BaseBdev4 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:43.051 23:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.051 23:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:43.310 [ 00:14:43.310 { 00:14:43.310 "name": "BaseBdev4", 00:14:43.310 "aliases": [ 00:14:43.310 "7ef712aa-74b9-4d75-889b-e889feecfae5" 00:14:43.310 ], 00:14:43.310 "product_name": "Malloc disk", 00:14:43.310 "block_size": 512, 00:14:43.310 "num_blocks": 65536, 00:14:43.310 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:43.310 "assigned_rate_limits": { 00:14:43.310 "rw_ios_per_sec": 0, 00:14:43.310 "rw_mbytes_per_sec": 0, 00:14:43.310 "r_mbytes_per_sec": 0, 00:14:43.310 "w_mbytes_per_sec": 0 00:14:43.310 }, 00:14:43.310 "claimed": false, 00:14:43.310 "zoned": false, 00:14:43.310 "supported_io_types": { 00:14:43.310 "read": true, 00:14:43.310 "write": true, 00:14:43.310 "unmap": true, 00:14:43.310 "flush": true, 00:14:43.310 "reset": true, 00:14:43.310 "nvme_admin": false, 00:14:43.310 "nvme_io": false, 00:14:43.310 
"nvme_io_md": false, 00:14:43.310 "write_zeroes": true, 00:14:43.310 "zcopy": true, 00:14:43.310 "get_zone_info": false, 00:14:43.310 "zone_management": false, 00:14:43.310 "zone_append": false, 00:14:43.310 "compare": false, 00:14:43.310 "compare_and_write": false, 00:14:43.310 "abort": true, 00:14:43.310 "seek_hole": false, 00:14:43.310 "seek_data": false, 00:14:43.310 "copy": true, 00:14:43.310 "nvme_iov_md": false 00:14:43.310 }, 00:14:43.310 "memory_domains": [ 00:14:43.310 { 00:14:43.310 "dma_device_id": "system", 00:14:43.310 "dma_device_type": 1 00:14:43.310 }, 00:14:43.310 { 00:14:43.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.310 "dma_device_type": 2 00:14:43.310 } 00:14:43.310 ], 00:14:43.310 "driver_specific": {} 00:14:43.310 } 00:14:43.310 ] 00:14:43.310 23:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:43.310 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:43.310 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:43.310 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:43.569 [2024-07-24 23:36:28.320450] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:43.569 [2024-07-24 23:36:28.320487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:43.569 [2024-07-24 23:36:28.320498] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:43.569 [2024-07-24 23:36:28.321554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:43.569 [2024-07-24 23:36:28.321584] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.569 "name": "Existed_Raid", 00:14:43.569 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:43.569 "strip_size_kb": 64, 00:14:43.569 "state": "configuring", 00:14:43.569 "raid_level": "raid0", 00:14:43.569 "superblock": true, 00:14:43.569 "num_base_bdevs": 4, 00:14:43.569 "num_base_bdevs_discovered": 3, 00:14:43.569 
"num_base_bdevs_operational": 4, 00:14:43.569 "base_bdevs_list": [ 00:14:43.569 { 00:14:43.569 "name": "BaseBdev1", 00:14:43.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.569 "is_configured": false, 00:14:43.569 "data_offset": 0, 00:14:43.569 "data_size": 0 00:14:43.569 }, 00:14:43.569 { 00:14:43.569 "name": "BaseBdev2", 00:14:43.569 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:43.569 "is_configured": true, 00:14:43.569 "data_offset": 2048, 00:14:43.569 "data_size": 63488 00:14:43.569 }, 00:14:43.569 { 00:14:43.569 "name": "BaseBdev3", 00:14:43.569 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:43.569 "is_configured": true, 00:14:43.569 "data_offset": 2048, 00:14:43.569 "data_size": 63488 00:14:43.569 }, 00:14:43.569 { 00:14:43.569 "name": "BaseBdev4", 00:14:43.569 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:43.569 "is_configured": true, 00:14:43.569 "data_offset": 2048, 00:14:43.569 "data_size": 63488 00:14:43.569 } 00:14:43.569 ] 00:14:43.569 }' 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.569 23:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.136 23:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:44.136 [2024-07-24 23:36:29.074375] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.136 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.394 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.394 "name": "Existed_Raid", 00:14:44.394 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:44.394 "strip_size_kb": 64, 00:14:44.394 "state": "configuring", 00:14:44.394 "raid_level": "raid0", 00:14:44.394 "superblock": true, 00:14:44.394 "num_base_bdevs": 4, 00:14:44.394 "num_base_bdevs_discovered": 2, 00:14:44.394 "num_base_bdevs_operational": 4, 00:14:44.394 "base_bdevs_list": [ 00:14:44.394 { 00:14:44.394 "name": "BaseBdev1", 00:14:44.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:44.394 "is_configured": false, 00:14:44.394 "data_offset": 0, 00:14:44.394 "data_size": 0 00:14:44.394 }, 00:14:44.394 { 00:14:44.394 "name": null, 00:14:44.394 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:44.394 "is_configured": false, 00:14:44.394 "data_offset": 2048, 00:14:44.394 "data_size": 
63488 00:14:44.394 }, 00:14:44.394 { 00:14:44.394 "name": "BaseBdev3", 00:14:44.394 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:44.394 "is_configured": true, 00:14:44.394 "data_offset": 2048, 00:14:44.394 "data_size": 63488 00:14:44.394 }, 00:14:44.394 { 00:14:44.394 "name": "BaseBdev4", 00:14:44.394 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:44.394 "is_configured": true, 00:14:44.394 "data_offset": 2048, 00:14:44.394 "data_size": 63488 00:14:44.394 } 00:14:44.394 ] 00:14:44.394 }' 00:14:44.394 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.394 23:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.959 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.959 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:44.959 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:44.959 23:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:45.217 [2024-07-24 23:36:30.011556] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:45.217 BaseBdev1 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 
00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.217 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:45.475 [ 00:14:45.475 { 00:14:45.475 "name": "BaseBdev1", 00:14:45.475 "aliases": [ 00:14:45.475 "6705be40-3dee-495d-b374-e32c08b00f91" 00:14:45.475 ], 00:14:45.475 "product_name": "Malloc disk", 00:14:45.475 "block_size": 512, 00:14:45.475 "num_blocks": 65536, 00:14:45.475 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:45.475 "assigned_rate_limits": { 00:14:45.475 "rw_ios_per_sec": 0, 00:14:45.475 "rw_mbytes_per_sec": 0, 00:14:45.475 "r_mbytes_per_sec": 0, 00:14:45.475 "w_mbytes_per_sec": 0 00:14:45.475 }, 00:14:45.475 "claimed": true, 00:14:45.475 "claim_type": "exclusive_write", 00:14:45.475 "zoned": false, 00:14:45.475 "supported_io_types": { 00:14:45.475 "read": true, 00:14:45.475 "write": true, 00:14:45.475 "unmap": true, 00:14:45.475 "flush": true, 00:14:45.475 "reset": true, 00:14:45.475 "nvme_admin": false, 00:14:45.475 "nvme_io": false, 00:14:45.475 "nvme_io_md": false, 00:14:45.475 "write_zeroes": true, 00:14:45.475 "zcopy": true, 00:14:45.475 "get_zone_info": false, 00:14:45.475 "zone_management": false, 00:14:45.475 "zone_append": false, 00:14:45.475 "compare": false, 00:14:45.475 "compare_and_write": false, 00:14:45.475 "abort": true, 00:14:45.475 "seek_hole": false, 00:14:45.475 "seek_data": false, 00:14:45.475 "copy": true, 00:14:45.475 "nvme_iov_md": false 00:14:45.475 }, 00:14:45.475 
"memory_domains": [ 00:14:45.475 { 00:14:45.475 "dma_device_id": "system", 00:14:45.475 "dma_device_type": 1 00:14:45.475 }, 00:14:45.475 { 00:14:45.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.475 "dma_device_type": 2 00:14:45.475 } 00:14:45.475 ], 00:14:45.475 "driver_specific": {} 00:14:45.475 } 00:14:45.475 ] 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.475 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.732 23:36:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.732 "name": "Existed_Raid", 00:14:45.732 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:45.732 "strip_size_kb": 64, 00:14:45.732 "state": "configuring", 00:14:45.732 "raid_level": "raid0", 00:14:45.732 "superblock": true, 00:14:45.732 "num_base_bdevs": 4, 00:14:45.732 "num_base_bdevs_discovered": 3, 00:14:45.732 "num_base_bdevs_operational": 4, 00:14:45.732 "base_bdevs_list": [ 00:14:45.732 { 00:14:45.732 "name": "BaseBdev1", 00:14:45.732 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:45.732 "is_configured": true, 00:14:45.732 "data_offset": 2048, 00:14:45.732 "data_size": 63488 00:14:45.732 }, 00:14:45.732 { 00:14:45.732 "name": null, 00:14:45.732 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:45.732 "is_configured": false, 00:14:45.732 "data_offset": 2048, 00:14:45.732 "data_size": 63488 00:14:45.732 }, 00:14:45.732 { 00:14:45.732 "name": "BaseBdev3", 00:14:45.732 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:45.732 "is_configured": true, 00:14:45.732 "data_offset": 2048, 00:14:45.732 "data_size": 63488 00:14:45.732 }, 00:14:45.732 { 00:14:45.732 "name": "BaseBdev4", 00:14:45.732 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:45.732 "is_configured": true, 00:14:45.732 "data_offset": 2048, 00:14:45.732 "data_size": 63488 00:14:45.732 } 00:14:45.732 ] 00:14:45.732 }' 00:14:45.732 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.732 23:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:45.990 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.990 23:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:46.247 23:36:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:46.247 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:46.508 [2024-07-24 23:36:31.298884] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.508 23:36:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.508 "name": "Existed_Raid", 00:14:46.508 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:46.508 "strip_size_kb": 64, 00:14:46.508 "state": "configuring", 00:14:46.508 "raid_level": "raid0", 00:14:46.508 "superblock": true, 00:14:46.508 "num_base_bdevs": 4, 00:14:46.508 "num_base_bdevs_discovered": 2, 00:14:46.508 "num_base_bdevs_operational": 4, 00:14:46.508 "base_bdevs_list": [ 00:14:46.508 { 00:14:46.508 "name": "BaseBdev1", 00:14:46.508 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:46.508 "is_configured": true, 00:14:46.508 "data_offset": 2048, 00:14:46.508 "data_size": 63488 00:14:46.508 }, 00:14:46.508 { 00:14:46.508 "name": null, 00:14:46.508 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:46.508 "is_configured": false, 00:14:46.508 "data_offset": 2048, 00:14:46.508 "data_size": 63488 00:14:46.508 }, 00:14:46.508 { 00:14:46.508 "name": null, 00:14:46.508 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:46.508 "is_configured": false, 00:14:46.508 "data_offset": 2048, 00:14:46.508 "data_size": 63488 00:14:46.508 }, 00:14:46.508 { 00:14:46.508 "name": "BaseBdev4", 00:14:46.508 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:46.508 "is_configured": true, 00:14:46.508 "data_offset": 2048, 00:14:46.508 "data_size": 63488 00:14:46.508 } 00:14:46.508 ] 00:14:46.508 }' 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.508 23:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.074 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.074 23:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:47.332 23:36:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:47.332 [2024-07-24 23:36:32.261400] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.332 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:47.591 23:36:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.591 "name": "Existed_Raid", 00:14:47.591 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:47.591 "strip_size_kb": 64, 00:14:47.591 "state": "configuring", 00:14:47.591 "raid_level": "raid0", 00:14:47.591 "superblock": true, 00:14:47.591 "num_base_bdevs": 4, 00:14:47.591 "num_base_bdevs_discovered": 3, 00:14:47.591 "num_base_bdevs_operational": 4, 00:14:47.591 "base_bdevs_list": [ 00:14:47.591 { 00:14:47.591 "name": "BaseBdev1", 00:14:47.591 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:47.591 "is_configured": true, 00:14:47.591 "data_offset": 2048, 00:14:47.591 "data_size": 63488 00:14:47.591 }, 00:14:47.591 { 00:14:47.591 "name": null, 00:14:47.591 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:47.591 "is_configured": false, 00:14:47.591 "data_offset": 2048, 00:14:47.591 "data_size": 63488 00:14:47.591 }, 00:14:47.591 { 00:14:47.591 "name": "BaseBdev3", 00:14:47.591 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:47.591 "is_configured": true, 00:14:47.591 "data_offset": 2048, 00:14:47.591 "data_size": 63488 00:14:47.591 }, 00:14:47.591 { 00:14:47.591 "name": "BaseBdev4", 00:14:47.591 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:47.591 "is_configured": true, 00:14:47.591 "data_offset": 2048, 00:14:47.591 "data_size": 63488 00:14:47.591 } 00:14:47.591 ] 00:14:47.591 }' 00:14:47.591 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.591 23:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.157 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.157 23:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:48.157 23:36:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:48.157 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:48.415 [2024-07-24 23:36:33.235953] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.416 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.672 23:36:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.672 "name": "Existed_Raid", 00:14:48.672 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:48.672 "strip_size_kb": 64, 00:14:48.672 "state": "configuring", 00:14:48.672 "raid_level": "raid0", 00:14:48.672 "superblock": true, 00:14:48.672 "num_base_bdevs": 4, 00:14:48.672 "num_base_bdevs_discovered": 2, 00:14:48.672 "num_base_bdevs_operational": 4, 00:14:48.672 "base_bdevs_list": [ 00:14:48.672 { 00:14:48.672 "name": null, 00:14:48.672 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:48.672 "is_configured": false, 00:14:48.672 "data_offset": 2048, 00:14:48.672 "data_size": 63488 00:14:48.672 }, 00:14:48.672 { 00:14:48.672 "name": null, 00:14:48.672 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:48.672 "is_configured": false, 00:14:48.672 "data_offset": 2048, 00:14:48.672 "data_size": 63488 00:14:48.672 }, 00:14:48.672 { 00:14:48.672 "name": "BaseBdev3", 00:14:48.672 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:48.672 "is_configured": true, 00:14:48.672 "data_offset": 2048, 00:14:48.672 "data_size": 63488 00:14:48.672 }, 00:14:48.672 { 00:14:48.672 "name": "BaseBdev4", 00:14:48.672 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:48.672 "is_configured": true, 00:14:48.672 "data_offset": 2048, 00:14:48.672 "data_size": 63488 00:14:48.672 } 00:14:48.672 ] 00:14:48.672 }' 00:14:48.672 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.672 23:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.929 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.929 23:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:49.186 23:36:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:49.186 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:49.444 [2024-07-24 23:36:34.228340] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.444 "name": "Existed_Raid", 00:14:49.444 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:49.444 "strip_size_kb": 64, 00:14:49.444 "state": "configuring", 00:14:49.444 "raid_level": "raid0", 00:14:49.444 "superblock": true, 00:14:49.444 "num_base_bdevs": 4, 00:14:49.444 "num_base_bdevs_discovered": 3, 00:14:49.444 "num_base_bdevs_operational": 4, 00:14:49.444 "base_bdevs_list": [ 00:14:49.444 { 00:14:49.444 "name": null, 00:14:49.444 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:49.444 "is_configured": false, 00:14:49.444 "data_offset": 2048, 00:14:49.444 "data_size": 63488 00:14:49.444 }, 00:14:49.444 { 00:14:49.444 "name": "BaseBdev2", 00:14:49.444 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:49.444 "is_configured": true, 00:14:49.444 "data_offset": 2048, 00:14:49.444 "data_size": 63488 00:14:49.444 }, 00:14:49.444 { 00:14:49.444 "name": "BaseBdev3", 00:14:49.444 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:49.444 "is_configured": true, 00:14:49.444 "data_offset": 2048, 00:14:49.444 "data_size": 63488 00:14:49.444 }, 00:14:49.444 { 00:14:49.444 "name": "BaseBdev4", 00:14:49.444 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:49.444 "is_configured": true, 00:14:49.444 "data_offset": 2048, 00:14:49.444 "data_size": 63488 00:14:49.444 } 00:14:49.444 ] 00:14:49.444 }' 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.444 23:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.009 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.009 23:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:50.266 23:36:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:50.266 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.266 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:50.266 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6705be40-3dee-495d-b374-e32c08b00f91 00:14:50.524 [2024-07-24 23:36:35.393859] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:50.524 [2024-07-24 23:36:35.393986] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16e4670 00:14:50.524 [2024-07-24 23:36:35.393994] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:50.524 [2024-07-24 23:36:35.394111] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16de570 00:14:50.524 [2024-07-24 23:36:35.394186] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16e4670 00:14:50.524 [2024-07-24 23:36:35.394191] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16e4670 00:14:50.524 [2024-07-24 23:36:35.394254] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:50.524 NewBaseBdev 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:50.524 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:50.782 [ 00:14:50.782 { 00:14:50.782 "name": "NewBaseBdev", 00:14:50.782 "aliases": [ 00:14:50.782 "6705be40-3dee-495d-b374-e32c08b00f91" 00:14:50.782 ], 00:14:50.782 "product_name": "Malloc disk", 00:14:50.782 "block_size": 512, 00:14:50.782 "num_blocks": 65536, 00:14:50.782 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:50.782 "assigned_rate_limits": { 00:14:50.782 "rw_ios_per_sec": 0, 00:14:50.782 "rw_mbytes_per_sec": 0, 00:14:50.782 "r_mbytes_per_sec": 0, 00:14:50.782 "w_mbytes_per_sec": 0 00:14:50.782 }, 00:14:50.782 "claimed": true, 00:14:50.782 "claim_type": "exclusive_write", 00:14:50.782 "zoned": false, 00:14:50.782 "supported_io_types": { 00:14:50.782 "read": true, 00:14:50.782 "write": true, 00:14:50.782 "unmap": true, 00:14:50.782 "flush": true, 00:14:50.782 "reset": true, 00:14:50.782 "nvme_admin": false, 00:14:50.782 "nvme_io": false, 00:14:50.782 "nvme_io_md": false, 00:14:50.782 "write_zeroes": true, 00:14:50.782 "zcopy": true, 00:14:50.782 "get_zone_info": false, 00:14:50.782 "zone_management": false, 00:14:50.782 "zone_append": false, 00:14:50.782 "compare": false, 00:14:50.782 "compare_and_write": false, 00:14:50.782 "abort": true, 00:14:50.782 "seek_hole": false, 00:14:50.782 "seek_data": false, 00:14:50.782 "copy": true, 00:14:50.782 
"nvme_iov_md": false 00:14:50.782 }, 00:14:50.782 "memory_domains": [ 00:14:50.782 { 00:14:50.782 "dma_device_id": "system", 00:14:50.782 "dma_device_type": 1 00:14:50.782 }, 00:14:50.782 { 00:14:50.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.782 "dma_device_type": 2 00:14:50.782 } 00:14:50.782 ], 00:14:50.782 "driver_specific": {} 00:14:50.782 } 00:14:50.782 ] 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.782 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:51.040 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.040 "name": "Existed_Raid", 00:14:51.040 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:51.040 "strip_size_kb": 64, 00:14:51.040 "state": "online", 00:14:51.040 "raid_level": "raid0", 00:14:51.040 "superblock": true, 00:14:51.040 "num_base_bdevs": 4, 00:14:51.040 "num_base_bdevs_discovered": 4, 00:14:51.040 "num_base_bdevs_operational": 4, 00:14:51.040 "base_bdevs_list": [ 00:14:51.040 { 00:14:51.040 "name": "NewBaseBdev", 00:14:51.040 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:51.040 "is_configured": true, 00:14:51.040 "data_offset": 2048, 00:14:51.040 "data_size": 63488 00:14:51.040 }, 00:14:51.040 { 00:14:51.040 "name": "BaseBdev2", 00:14:51.040 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:51.040 "is_configured": true, 00:14:51.040 "data_offset": 2048, 00:14:51.040 "data_size": 63488 00:14:51.040 }, 00:14:51.040 { 00:14:51.040 "name": "BaseBdev3", 00:14:51.040 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:51.040 "is_configured": true, 00:14:51.040 "data_offset": 2048, 00:14:51.040 "data_size": 63488 00:14:51.040 }, 00:14:51.040 { 00:14:51.040 "name": "BaseBdev4", 00:14:51.040 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:51.040 "is_configured": true, 00:14:51.040 "data_offset": 2048, 00:14:51.040 "data_size": 63488 00:14:51.040 } 00:14:51.040 ] 00:14:51.040 }' 00:14:51.040 23:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.040 23:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:51.606 [2024-07-24 23:36:36.513022] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:51.606 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:51.606 "name": "Existed_Raid", 00:14:51.606 "aliases": [ 00:14:51.606 "4ce9975b-fc34-4346-8adb-7bce00d04393" 00:14:51.606 ], 00:14:51.606 "product_name": "Raid Volume", 00:14:51.606 "block_size": 512, 00:14:51.606 "num_blocks": 253952, 00:14:51.606 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:51.606 "assigned_rate_limits": { 00:14:51.606 "rw_ios_per_sec": 0, 00:14:51.606 "rw_mbytes_per_sec": 0, 00:14:51.606 "r_mbytes_per_sec": 0, 00:14:51.606 "w_mbytes_per_sec": 0 00:14:51.606 }, 00:14:51.606 "claimed": false, 00:14:51.606 "zoned": false, 00:14:51.606 "supported_io_types": { 00:14:51.606 "read": true, 00:14:51.606 "write": true, 00:14:51.606 "unmap": true, 00:14:51.606 "flush": true, 00:14:51.606 "reset": true, 00:14:51.606 "nvme_admin": false, 00:14:51.606 "nvme_io": false, 00:14:51.606 "nvme_io_md": false, 00:14:51.606 "write_zeroes": true, 00:14:51.606 "zcopy": false, 00:14:51.606 "get_zone_info": false, 00:14:51.606 "zone_management": false, 00:14:51.606 "zone_append": false, 00:14:51.606 "compare": false, 00:14:51.606 "compare_and_write": false, 00:14:51.606 "abort": false, 
00:14:51.606 "seek_hole": false, 00:14:51.606 "seek_data": false, 00:14:51.606 "copy": false, 00:14:51.606 "nvme_iov_md": false 00:14:51.606 }, 00:14:51.606 "memory_domains": [ 00:14:51.606 { 00:14:51.606 "dma_device_id": "system", 00:14:51.606 "dma_device_type": 1 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.606 "dma_device_type": 2 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "system", 00:14:51.606 "dma_device_type": 1 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.606 "dma_device_type": 2 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "system", 00:14:51.606 "dma_device_type": 1 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.606 "dma_device_type": 2 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "system", 00:14:51.606 "dma_device_type": 1 00:14:51.606 }, 00:14:51.606 { 00:14:51.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.606 "dma_device_type": 2 00:14:51.606 } 00:14:51.606 ], 00:14:51.606 "driver_specific": { 00:14:51.606 "raid": { 00:14:51.606 "uuid": "4ce9975b-fc34-4346-8adb-7bce00d04393", 00:14:51.606 "strip_size_kb": 64, 00:14:51.606 "state": "online", 00:14:51.606 "raid_level": "raid0", 00:14:51.606 "superblock": true, 00:14:51.606 "num_base_bdevs": 4, 00:14:51.606 "num_base_bdevs_discovered": 4, 00:14:51.606 "num_base_bdevs_operational": 4, 00:14:51.606 "base_bdevs_list": [ 00:14:51.607 { 00:14:51.607 "name": "NewBaseBdev", 00:14:51.607 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:51.607 "is_configured": true, 00:14:51.607 "data_offset": 2048, 00:14:51.607 "data_size": 63488 00:14:51.607 }, 00:14:51.607 { 00:14:51.607 "name": "BaseBdev2", 00:14:51.607 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:51.607 "is_configured": true, 00:14:51.607 "data_offset": 2048, 00:14:51.607 "data_size": 63488 00:14:51.607 }, 00:14:51.607 { 00:14:51.607 "name": 
"BaseBdev3", 00:14:51.607 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:51.607 "is_configured": true, 00:14:51.607 "data_offset": 2048, 00:14:51.607 "data_size": 63488 00:14:51.607 }, 00:14:51.607 { 00:14:51.607 "name": "BaseBdev4", 00:14:51.607 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:51.607 "is_configured": true, 00:14:51.607 "data_offset": 2048, 00:14:51.607 "data_size": 63488 00:14:51.607 } 00:14:51.607 ] 00:14:51.607 } 00:14:51.607 } 00:14:51.607 }' 00:14:51.607 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:51.607 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:51.607 BaseBdev2 00:14:51.607 BaseBdev3 00:14:51.607 BaseBdev4' 00:14:51.607 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.607 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:51.607 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.864 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.864 "name": "NewBaseBdev", 00:14:51.864 "aliases": [ 00:14:51.864 "6705be40-3dee-495d-b374-e32c08b00f91" 00:14:51.864 ], 00:14:51.864 "product_name": "Malloc disk", 00:14:51.864 "block_size": 512, 00:14:51.864 "num_blocks": 65536, 00:14:51.864 "uuid": "6705be40-3dee-495d-b374-e32c08b00f91", 00:14:51.864 "assigned_rate_limits": { 00:14:51.864 "rw_ios_per_sec": 0, 00:14:51.864 "rw_mbytes_per_sec": 0, 00:14:51.864 "r_mbytes_per_sec": 0, 00:14:51.864 "w_mbytes_per_sec": 0 00:14:51.864 }, 00:14:51.864 "claimed": true, 00:14:51.864 "claim_type": "exclusive_write", 00:14:51.864 "zoned": false, 00:14:51.864 
"supported_io_types": { 00:14:51.864 "read": true, 00:14:51.864 "write": true, 00:14:51.864 "unmap": true, 00:14:51.864 "flush": true, 00:14:51.864 "reset": true, 00:14:51.864 "nvme_admin": false, 00:14:51.864 "nvme_io": false, 00:14:51.864 "nvme_io_md": false, 00:14:51.864 "write_zeroes": true, 00:14:51.864 "zcopy": true, 00:14:51.864 "get_zone_info": false, 00:14:51.864 "zone_management": false, 00:14:51.864 "zone_append": false, 00:14:51.864 "compare": false, 00:14:51.864 "compare_and_write": false, 00:14:51.864 "abort": true, 00:14:51.864 "seek_hole": false, 00:14:51.864 "seek_data": false, 00:14:51.864 "copy": true, 00:14:51.864 "nvme_iov_md": false 00:14:51.864 }, 00:14:51.864 "memory_domains": [ 00:14:51.864 { 00:14:51.864 "dma_device_id": "system", 00:14:51.864 "dma_device_type": 1 00:14:51.864 }, 00:14:51.864 { 00:14:51.864 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.864 "dma_device_type": 2 00:14:51.864 } 00:14:51.864 ], 00:14:51.864 "driver_specific": {} 00:14:51.864 }' 00:14:51.864 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.864 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.864 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.864 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.122 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.122 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.122 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.122 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.122 23:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.122 23:36:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.122 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.122 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.122 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.122 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:52.122 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.379 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.379 "name": "BaseBdev2", 00:14:52.379 "aliases": [ 00:14:52.379 "ff84529b-7b19-4a38-9428-6212ebbcb076" 00:14:52.379 ], 00:14:52.379 "product_name": "Malloc disk", 00:14:52.379 "block_size": 512, 00:14:52.379 "num_blocks": 65536, 00:14:52.379 "uuid": "ff84529b-7b19-4a38-9428-6212ebbcb076", 00:14:52.379 "assigned_rate_limits": { 00:14:52.379 "rw_ios_per_sec": 0, 00:14:52.379 "rw_mbytes_per_sec": 0, 00:14:52.379 "r_mbytes_per_sec": 0, 00:14:52.379 "w_mbytes_per_sec": 0 00:14:52.379 }, 00:14:52.379 "claimed": true, 00:14:52.379 "claim_type": "exclusive_write", 00:14:52.379 "zoned": false, 00:14:52.379 "supported_io_types": { 00:14:52.379 "read": true, 00:14:52.380 "write": true, 00:14:52.380 "unmap": true, 00:14:52.380 "flush": true, 00:14:52.380 "reset": true, 00:14:52.380 "nvme_admin": false, 00:14:52.380 "nvme_io": false, 00:14:52.380 "nvme_io_md": false, 00:14:52.380 "write_zeroes": true, 00:14:52.380 "zcopy": true, 00:14:52.380 "get_zone_info": false, 00:14:52.380 "zone_management": false, 00:14:52.380 "zone_append": false, 00:14:52.380 "compare": false, 00:14:52.380 "compare_and_write": false, 00:14:52.380 "abort": true, 00:14:52.380 
"seek_hole": false, 00:14:52.380 "seek_data": false, 00:14:52.380 "copy": true, 00:14:52.380 "nvme_iov_md": false 00:14:52.380 }, 00:14:52.380 "memory_domains": [ 00:14:52.380 { 00:14:52.380 "dma_device_id": "system", 00:14:52.380 "dma_device_type": 1 00:14:52.380 }, 00:14:52.380 { 00:14:52.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.380 "dma_device_type": 2 00:14:52.380 } 00:14:52.380 ], 00:14:52.380 "driver_specific": {} 00:14:52.380 }' 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.380 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.637 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:52.638 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.896 "name": "BaseBdev3", 00:14:52.896 "aliases": [ 00:14:52.896 "6614d98d-428b-4e2a-a0eb-88bb9519415c" 00:14:52.896 ], 00:14:52.896 "product_name": "Malloc disk", 00:14:52.896 "block_size": 512, 00:14:52.896 "num_blocks": 65536, 00:14:52.896 "uuid": "6614d98d-428b-4e2a-a0eb-88bb9519415c", 00:14:52.896 "assigned_rate_limits": { 00:14:52.896 "rw_ios_per_sec": 0, 00:14:52.896 "rw_mbytes_per_sec": 0, 00:14:52.896 "r_mbytes_per_sec": 0, 00:14:52.896 "w_mbytes_per_sec": 0 00:14:52.896 }, 00:14:52.896 "claimed": true, 00:14:52.896 "claim_type": "exclusive_write", 00:14:52.896 "zoned": false, 00:14:52.896 "supported_io_types": { 00:14:52.896 "read": true, 00:14:52.896 "write": true, 00:14:52.896 "unmap": true, 00:14:52.896 "flush": true, 00:14:52.896 "reset": true, 00:14:52.896 "nvme_admin": false, 00:14:52.896 "nvme_io": false, 00:14:52.896 "nvme_io_md": false, 00:14:52.896 "write_zeroes": true, 00:14:52.896 "zcopy": true, 00:14:52.896 "get_zone_info": false, 00:14:52.896 "zone_management": false, 00:14:52.896 "zone_append": false, 00:14:52.896 "compare": false, 00:14:52.896 "compare_and_write": false, 00:14:52.896 "abort": true, 00:14:52.896 "seek_hole": false, 00:14:52.896 "seek_data": false, 00:14:52.896 "copy": true, 00:14:52.896 "nvme_iov_md": false 00:14:52.896 }, 00:14:52.896 "memory_domains": [ 00:14:52.896 { 00:14:52.896 "dma_device_id": "system", 00:14:52.896 "dma_device_type": 1 00:14:52.896 }, 00:14:52.896 { 00:14:52.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.896 "dma_device_type": 2 00:14:52.896 } 00:14:52.896 ], 00:14:52.896 "driver_specific": {} 00:14:52.896 }' 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.896 
23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.896 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.163 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.163 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.164 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.164 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.164 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:53.164 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:53.164 23:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:53.164 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:53.164 "name": "BaseBdev4", 00:14:53.164 "aliases": [ 00:14:53.164 "7ef712aa-74b9-4d75-889b-e889feecfae5" 00:14:53.164 ], 00:14:53.164 "product_name": "Malloc disk", 00:14:53.164 "block_size": 512, 00:14:53.164 "num_blocks": 65536, 00:14:53.164 "uuid": "7ef712aa-74b9-4d75-889b-e889feecfae5", 00:14:53.164 
"assigned_rate_limits": { 00:14:53.164 "rw_ios_per_sec": 0, 00:14:53.164 "rw_mbytes_per_sec": 0, 00:14:53.164 "r_mbytes_per_sec": 0, 00:14:53.164 "w_mbytes_per_sec": 0 00:14:53.164 }, 00:14:53.164 "claimed": true, 00:14:53.164 "claim_type": "exclusive_write", 00:14:53.164 "zoned": false, 00:14:53.164 "supported_io_types": { 00:14:53.164 "read": true, 00:14:53.164 "write": true, 00:14:53.164 "unmap": true, 00:14:53.164 "flush": true, 00:14:53.164 "reset": true, 00:14:53.164 "nvme_admin": false, 00:14:53.164 "nvme_io": false, 00:14:53.164 "nvme_io_md": false, 00:14:53.164 "write_zeroes": true, 00:14:53.164 "zcopy": true, 00:14:53.164 "get_zone_info": false, 00:14:53.164 "zone_management": false, 00:14:53.164 "zone_append": false, 00:14:53.164 "compare": false, 00:14:53.164 "compare_and_write": false, 00:14:53.164 "abort": true, 00:14:53.164 "seek_hole": false, 00:14:53.164 "seek_data": false, 00:14:53.164 "copy": true, 00:14:53.164 "nvme_iov_md": false 00:14:53.164 }, 00:14:53.164 "memory_domains": [ 00:14:53.164 { 00:14:53.164 "dma_device_id": "system", 00:14:53.164 "dma_device_type": 1 00:14:53.164 }, 00:14:53.164 { 00:14:53.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.164 "dma_device_type": 2 00:14:53.164 } 00:14:53.164 ], 00:14:53.164 "driver_specific": {} 00:14:53.164 }' 00:14:53.164 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.421 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.421 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:53.421 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.421 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.422 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:53.679 [2024-07-24 23:36:38.594155] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:53.679 [2024-07-24 23:36:38.594174] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:53.679 [2024-07-24 23:36:38.594209] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:53.679 [2024-07-24 23:36:38.594249] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:53.679 [2024-07-24 23:36:38.594255] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e4670 name Existed_Raid, state offline 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 307346 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 307346 ']' 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 307346 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 307346 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 307346' 00:14:53.679 killing process with pid 307346 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 307346 00:14:53.679 [2024-07-24 23:36:38.646368] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:53.679 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 307346 00:14:53.679 [2024-07-24 23:36:38.676740] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:53.937 23:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:53.937 00:14:53.937 real 0m23.872s 00:14:53.937 user 0m44.469s 00:14:53.937 sys 0m3.653s 00:14:53.937 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:53.937 23:36:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:53.937 ************************************ 00:14:53.937 END TEST raid_state_function_test_sb 00:14:53.937 ************************************ 00:14:53.937 23:36:38 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:14:53.937 23:36:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:53.937 23:36:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:53.937 23:36:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:53.937 ************************************ 
00:14:53.937 START TEST raid_superblock_test 00:14:53.937 ************************************ 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 4 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:53.937 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # 
raid_pid=311904 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 311904 /var/tmp/spdk-raid.sock 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 311904 ']' 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:53.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:53.938 23:36:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.196 [2024-07-24 23:36:38.970034] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:14:54.196 [2024-07-24 23:36:38.970072] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311904 ] 00:14:54.196 [2024-07-24 23:36:39.034565] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:54.196 [2024-07-24 23:36:39.112852] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:54.196 [2024-07-24 23:36:39.171426] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:54.196 [2024-07-24 23:36:39.171454] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:55.129 malloc1 00:14:55.129 23:36:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:55.129 [2024-07-24 23:36:40.083847] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:55.129 [2024-07-24 23:36:40.083885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.129 [2024-07-24 23:36:40.083897] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xebadd0 00:14:55.129 [2024-07-24 23:36:40.083903] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.129 [2024-07-24 23:36:40.085099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.129 [2024-07-24 23:36:40.085120] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:55.129 pt1 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:55.129 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:55.129 23:36:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:55.388 malloc2 00:14:55.388 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:55.646 [2024-07-24 23:36:40.424220] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:55.646 [2024-07-24 23:36:40.424254] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.646 [2024-07-24 23:36:40.424266] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xebb8d0 00:14:55.646 [2024-07-24 23:36:40.424272] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.646 [2024-07-24 23:36:40.425331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.646 [2024-07-24 23:36:40.425352] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:55.646 pt2 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:55.646 23:36:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:55.646 malloc3 00:14:55.646 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:55.905 [2024-07-24 23:36:40.760676] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:55.905 [2024-07-24 23:36:40.760705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.905 [2024-07-24 23:36:40.760714] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7c740 00:14:55.905 [2024-07-24 23:36:40.760720] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.905 [2024-07-24 23:36:40.761728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.905 [2024-07-24 23:36:40.761749] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:55.905 pt3 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:55.905 
23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:55.905 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:14:56.163 malloc4 00:14:56.163 23:36:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:14:56.163 [2024-07-24 23:36:41.105114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:14:56.163 [2024-07-24 23:36:41.105148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.163 [2024-07-24 23:36:41.105159] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb2990 00:14:56.163 [2024-07-24 23:36:41.105166] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.163 [2024-07-24 23:36:41.106222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.163 [2024-07-24 23:36:41.106243] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:14:56.163 pt4 00:14:56.163 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:56.163 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:56.163 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:14:56.420 [2024-07-24 23:36:41.269557] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:14:56.420 [2024-07-24 23:36:41.270411] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:56.420 [2024-07-24 23:36:41.270448] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:56.420 [2024-07-24 23:36:41.270482] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:14:56.420 [2024-07-24 23:36:41.270598] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xeb49d0 00:14:56.420 [2024-07-24 23:36:41.270605] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:56.420 [2024-07-24 23:36:41.270739] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xebd7d0 00:14:56.420 [2024-07-24 23:36:41.270838] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeb49d0 00:14:56.420 [2024-07-24 23:36:41.270843] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xeb49d0 00:14:56.420 [2024-07-24 23:36:41.270908] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.420 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:56.677 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.677 "name": "raid_bdev1", 00:14:56.677 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:14:56.677 "strip_size_kb": 64, 00:14:56.677 "state": "online", 00:14:56.677 "raid_level": "raid0", 00:14:56.677 "superblock": true, 00:14:56.677 "num_base_bdevs": 4, 00:14:56.677 "num_base_bdevs_discovered": 4, 00:14:56.677 "num_base_bdevs_operational": 4, 00:14:56.677 "base_bdevs_list": [ 00:14:56.677 { 00:14:56.677 "name": "pt1", 00:14:56.677 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.677 "is_configured": true, 00:14:56.677 "data_offset": 2048, 00:14:56.677 "data_size": 63488 00:14:56.677 }, 00:14:56.677 { 00:14:56.677 "name": "pt2", 00:14:56.677 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.677 "is_configured": true, 00:14:56.677 "data_offset": 2048, 00:14:56.677 "data_size": 63488 00:14:56.677 }, 00:14:56.677 { 00:14:56.677 "name": "pt3", 00:14:56.677 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:56.677 "is_configured": true, 00:14:56.677 "data_offset": 2048, 00:14:56.677 "data_size": 63488 00:14:56.677 }, 00:14:56.678 { 00:14:56.678 "name": "pt4", 00:14:56.678 "uuid": "00000000-0000-0000-0000-000000000004", 00:14:56.678 "is_configured": true, 00:14:56.678 "data_offset": 2048, 00:14:56.678 "data_size": 63488 00:14:56.678 } 00:14:56.678 ] 00:14:56.678 }' 00:14:56.678 23:36:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.678 23:36:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:56.935 23:36:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:57.192 [2024-07-24 23:36:42.055768] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:57.192 "name": "raid_bdev1", 00:14:57.192 "aliases": [ 00:14:57.192 "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8" 00:14:57.192 ], 00:14:57.192 "product_name": "Raid Volume", 00:14:57.192 "block_size": 512, 00:14:57.192 "num_blocks": 253952, 00:14:57.192 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:14:57.192 "assigned_rate_limits": { 00:14:57.192 "rw_ios_per_sec": 0, 00:14:57.192 "rw_mbytes_per_sec": 0, 00:14:57.192 "r_mbytes_per_sec": 0, 00:14:57.192 "w_mbytes_per_sec": 0 00:14:57.192 }, 00:14:57.192 "claimed": false, 00:14:57.192 "zoned": false, 00:14:57.192 "supported_io_types": { 00:14:57.192 "read": true, 00:14:57.192 "write": true, 00:14:57.192 
"unmap": true, 00:14:57.192 "flush": true, 00:14:57.192 "reset": true, 00:14:57.192 "nvme_admin": false, 00:14:57.192 "nvme_io": false, 00:14:57.192 "nvme_io_md": false, 00:14:57.192 "write_zeroes": true, 00:14:57.192 "zcopy": false, 00:14:57.192 "get_zone_info": false, 00:14:57.192 "zone_management": false, 00:14:57.192 "zone_append": false, 00:14:57.192 "compare": false, 00:14:57.192 "compare_and_write": false, 00:14:57.192 "abort": false, 00:14:57.192 "seek_hole": false, 00:14:57.192 "seek_data": false, 00:14:57.192 "copy": false, 00:14:57.192 "nvme_iov_md": false 00:14:57.192 }, 00:14:57.192 "memory_domains": [ 00:14:57.192 { 00:14:57.192 "dma_device_id": "system", 00:14:57.192 "dma_device_type": 1 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.192 "dma_device_type": 2 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "system", 00:14:57.192 "dma_device_type": 1 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.192 "dma_device_type": 2 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "system", 00:14:57.192 "dma_device_type": 1 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.192 "dma_device_type": 2 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "system", 00:14:57.192 "dma_device_type": 1 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.192 "dma_device_type": 2 00:14:57.192 } 00:14:57.192 ], 00:14:57.192 "driver_specific": { 00:14:57.192 "raid": { 00:14:57.192 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:14:57.192 "strip_size_kb": 64, 00:14:57.192 "state": "online", 00:14:57.192 "raid_level": "raid0", 00:14:57.192 "superblock": true, 00:14:57.192 "num_base_bdevs": 4, 00:14:57.192 "num_base_bdevs_discovered": 4, 00:14:57.192 "num_base_bdevs_operational": 4, 00:14:57.192 "base_bdevs_list": [ 00:14:57.192 { 00:14:57.192 "name": "pt1", 
00:14:57.192 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:57.192 "is_configured": true, 00:14:57.192 "data_offset": 2048, 00:14:57.192 "data_size": 63488 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "name": "pt2", 00:14:57.192 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:57.192 "is_configured": true, 00:14:57.192 "data_offset": 2048, 00:14:57.192 "data_size": 63488 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "name": "pt3", 00:14:57.192 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:57.192 "is_configured": true, 00:14:57.192 "data_offset": 2048, 00:14:57.192 "data_size": 63488 00:14:57.192 }, 00:14:57.192 { 00:14:57.192 "name": "pt4", 00:14:57.192 "uuid": "00000000-0000-0000-0000-000000000004", 00:14:57.192 "is_configured": true, 00:14:57.192 "data_offset": 2048, 00:14:57.192 "data_size": 63488 00:14:57.192 } 00:14:57.192 ] 00:14:57.192 } 00:14:57.192 } 00:14:57.192 }' 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:57.192 pt2 00:14:57.192 pt3 00:14:57.192 pt4' 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.192 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:57.448 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.448 "name": "pt1", 00:14:57.448 "aliases": [ 00:14:57.448 "00000000-0000-0000-0000-000000000001" 00:14:57.448 ], 00:14:57.448 "product_name": "passthru", 00:14:57.448 "block_size": 512, 00:14:57.448 "num_blocks": 65536, 00:14:57.448 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:14:57.449 "assigned_rate_limits": { 00:14:57.449 "rw_ios_per_sec": 0, 00:14:57.449 "rw_mbytes_per_sec": 0, 00:14:57.449 "r_mbytes_per_sec": 0, 00:14:57.449 "w_mbytes_per_sec": 0 00:14:57.449 }, 00:14:57.449 "claimed": true, 00:14:57.449 "claim_type": "exclusive_write", 00:14:57.449 "zoned": false, 00:14:57.449 "supported_io_types": { 00:14:57.449 "read": true, 00:14:57.449 "write": true, 00:14:57.449 "unmap": true, 00:14:57.449 "flush": true, 00:14:57.449 "reset": true, 00:14:57.449 "nvme_admin": false, 00:14:57.449 "nvme_io": false, 00:14:57.449 "nvme_io_md": false, 00:14:57.449 "write_zeroes": true, 00:14:57.449 "zcopy": true, 00:14:57.449 "get_zone_info": false, 00:14:57.449 "zone_management": false, 00:14:57.449 "zone_append": false, 00:14:57.449 "compare": false, 00:14:57.449 "compare_and_write": false, 00:14:57.449 "abort": true, 00:14:57.449 "seek_hole": false, 00:14:57.449 "seek_data": false, 00:14:57.449 "copy": true, 00:14:57.449 "nvme_iov_md": false 00:14:57.449 }, 00:14:57.449 "memory_domains": [ 00:14:57.449 { 00:14:57.449 "dma_device_id": "system", 00:14:57.449 "dma_device_type": 1 00:14:57.449 }, 00:14:57.449 { 00:14:57.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.449 "dma_device_type": 2 00:14:57.449 } 00:14:57.449 ], 00:14:57.449 "driver_specific": { 00:14:57.449 "passthru": { 00:14:57.449 "name": "pt1", 00:14:57.449 "base_bdev_name": "malloc1" 00:14:57.449 } 00:14:57.449 } 00:14:57.449 }' 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.449 23:36:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.449 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.705 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:57.706 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.962 "name": "pt2", 00:14:57.962 "aliases": [ 00:14:57.962 "00000000-0000-0000-0000-000000000002" 00:14:57.962 ], 00:14:57.962 "product_name": "passthru", 00:14:57.962 "block_size": 512, 00:14:57.962 "num_blocks": 65536, 00:14:57.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:57.962 "assigned_rate_limits": { 00:14:57.962 "rw_ios_per_sec": 0, 00:14:57.962 "rw_mbytes_per_sec": 0, 00:14:57.962 "r_mbytes_per_sec": 0, 00:14:57.962 "w_mbytes_per_sec": 0 00:14:57.962 }, 00:14:57.962 "claimed": true, 00:14:57.962 "claim_type": "exclusive_write", 00:14:57.962 "zoned": false, 00:14:57.962 "supported_io_types": { 00:14:57.962 "read": true, 00:14:57.962 "write": true, 00:14:57.962 "unmap": true, 00:14:57.962 "flush": true, 00:14:57.962 "reset": true, 00:14:57.962 "nvme_admin": false, 00:14:57.962 
"nvme_io": false, 00:14:57.962 "nvme_io_md": false, 00:14:57.962 "write_zeroes": true, 00:14:57.962 "zcopy": true, 00:14:57.962 "get_zone_info": false, 00:14:57.962 "zone_management": false, 00:14:57.962 "zone_append": false, 00:14:57.962 "compare": false, 00:14:57.962 "compare_and_write": false, 00:14:57.962 "abort": true, 00:14:57.962 "seek_hole": false, 00:14:57.962 "seek_data": false, 00:14:57.962 "copy": true, 00:14:57.962 "nvme_iov_md": false 00:14:57.962 }, 00:14:57.962 "memory_domains": [ 00:14:57.962 { 00:14:57.962 "dma_device_id": "system", 00:14:57.962 "dma_device_type": 1 00:14:57.962 }, 00:14:57.962 { 00:14:57.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.962 "dma_device_type": 2 00:14:57.962 } 00:14:57.962 ], 00:14:57.962 "driver_specific": { 00:14:57.962 "passthru": { 00:14:57.962 "name": "pt2", 00:14:57.962 "base_bdev_name": "malloc2" 00:14:57.962 } 00:14:57.962 } 00:14:57.962 }' 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.962 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.218 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.218 23:36:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.218 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:58.218 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.218 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.218 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:58.218 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.475 "name": "pt3", 00:14:58.475 "aliases": [ 00:14:58.475 "00000000-0000-0000-0000-000000000003" 00:14:58.475 ], 00:14:58.475 "product_name": "passthru", 00:14:58.475 "block_size": 512, 00:14:58.475 "num_blocks": 65536, 00:14:58.475 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:58.475 "assigned_rate_limits": { 00:14:58.475 "rw_ios_per_sec": 0, 00:14:58.475 "rw_mbytes_per_sec": 0, 00:14:58.475 "r_mbytes_per_sec": 0, 00:14:58.475 "w_mbytes_per_sec": 0 00:14:58.475 }, 00:14:58.475 "claimed": true, 00:14:58.475 "claim_type": "exclusive_write", 00:14:58.475 "zoned": false, 00:14:58.475 "supported_io_types": { 00:14:58.475 "read": true, 00:14:58.475 "write": true, 00:14:58.475 "unmap": true, 00:14:58.475 "flush": true, 00:14:58.475 "reset": true, 00:14:58.475 "nvme_admin": false, 00:14:58.475 "nvme_io": false, 00:14:58.475 "nvme_io_md": false, 00:14:58.475 "write_zeroes": true, 00:14:58.475 "zcopy": true, 00:14:58.475 "get_zone_info": false, 00:14:58.475 "zone_management": false, 00:14:58.475 "zone_append": false, 00:14:58.475 "compare": false, 00:14:58.475 "compare_and_write": false, 00:14:58.475 "abort": true, 00:14:58.475 "seek_hole": false, 00:14:58.475 "seek_data": false, 00:14:58.475 "copy": true, 00:14:58.475 "nvme_iov_md": false 00:14:58.475 }, 00:14:58.475 "memory_domains": [ 00:14:58.475 { 00:14:58.475 "dma_device_id": "system", 00:14:58.475 
"dma_device_type": 1 00:14:58.475 }, 00:14:58.475 { 00:14:58.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.475 "dma_device_type": 2 00:14:58.475 } 00:14:58.475 ], 00:14:58.475 "driver_specific": { 00:14:58.475 "passthru": { 00:14:58.475 "name": "pt3", 00:14:58.475 "base_bdev_name": "malloc3" 00:14:58.475 } 00:14:58.475 } 00:14:58.475 }' 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.475 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.476 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.476 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.476 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.732 23:36:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.732 "name": "pt4", 00:14:58.732 "aliases": [ 00:14:58.732 "00000000-0000-0000-0000-000000000004" 00:14:58.732 ], 00:14:58.732 "product_name": "passthru", 00:14:58.732 "block_size": 512, 00:14:58.733 "num_blocks": 65536, 00:14:58.733 "uuid": "00000000-0000-0000-0000-000000000004", 00:14:58.733 "assigned_rate_limits": { 00:14:58.733 "rw_ios_per_sec": 0, 00:14:58.733 "rw_mbytes_per_sec": 0, 00:14:58.733 "r_mbytes_per_sec": 0, 00:14:58.733 "w_mbytes_per_sec": 0 00:14:58.733 }, 00:14:58.733 "claimed": true, 00:14:58.733 "claim_type": "exclusive_write", 00:14:58.733 "zoned": false, 00:14:58.733 "supported_io_types": { 00:14:58.733 "read": true, 00:14:58.733 "write": true, 00:14:58.733 "unmap": true, 00:14:58.733 "flush": true, 00:14:58.733 "reset": true, 00:14:58.733 "nvme_admin": false, 00:14:58.733 "nvme_io": false, 00:14:58.733 "nvme_io_md": false, 00:14:58.733 "write_zeroes": true, 00:14:58.733 "zcopy": true, 00:14:58.733 "get_zone_info": false, 00:14:58.733 "zone_management": false, 00:14:58.733 "zone_append": false, 00:14:58.733 "compare": false, 00:14:58.733 "compare_and_write": false, 00:14:58.733 "abort": true, 00:14:58.733 "seek_hole": false, 00:14:58.733 "seek_data": false, 00:14:58.733 "copy": true, 00:14:58.733 "nvme_iov_md": false 00:14:58.733 }, 00:14:58.733 "memory_domains": [ 00:14:58.733 { 00:14:58.733 "dma_device_id": "system", 00:14:58.733 "dma_device_type": 1 00:14:58.733 }, 00:14:58.733 { 00:14:58.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.733 "dma_device_type": 2 00:14:58.733 } 00:14:58.733 ], 00:14:58.733 "driver_specific": { 00:14:58.733 "passthru": { 00:14:58.733 "name": "pt4", 00:14:58.733 "base_bdev_name": "malloc4" 00:14:58.733 } 00:14:58.733 } 00:14:58.733 }' 00:14:58.733 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.989 23:36:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.989 23:36:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:59.246 [2024-07-24 23:36:44.157236] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c1c40ed4-9171-4af9-a9a7-70bad9a29ee8 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c1c40ed4-9171-4af9-a9a7-70bad9a29ee8 ']' 00:14:59.246 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:59.502 [2024-07-24 23:36:44.313442] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:59.502 
[2024-07-24 23:36:44.313454] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:59.502 [2024-07-24 23:36:44.313490] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:59.502 [2024-07-24 23:36:44.313533] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:59.502 [2024-07-24 23:36:44.313538] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeb49d0 name raid_bdev1, state offline 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:59.502 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:59.759 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:59.759 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:00.017 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:00.017 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:00.017 23:36:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:00.017 23:36:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:15:00.274 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:00.274 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:00.531 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:00.531 [2024-07-24 23:36:45.456378] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:00.532 [2024-07-24 23:36:45.457355] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:00.532 [2024-07-24 23:36:45.457386] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:00.532 [2024-07-24 23:36:45.457408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:15:00.532 [2024-07-24 23:36:45.457440] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:00.532 [2024-07-24 23:36:45.457466] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:00.532 [2024-07-24 23:36:45.457488] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:00.532 [2024-07-24 23:36:45.457500] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:15:00.532 
[2024-07-24 23:36:45.457509] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:00.532 [2024-07-24 23:36:45.457515] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xebb250 name raid_bdev1, state configuring 00:15:00.532 request: 00:15:00.532 { 00:15:00.532 "name": "raid_bdev1", 00:15:00.532 "raid_level": "raid0", 00:15:00.532 "base_bdevs": [ 00:15:00.532 "malloc1", 00:15:00.532 "malloc2", 00:15:00.532 "malloc3", 00:15:00.532 "malloc4" 00:15:00.532 ], 00:15:00.532 "strip_size_kb": 64, 00:15:00.532 "superblock": false, 00:15:00.532 "method": "bdev_raid_create", 00:15:00.532 "req_id": 1 00:15:00.532 } 00:15:00.532 Got JSON-RPC error response 00:15:00.532 response: 00:15:00.532 { 00:15:00.532 "code": -17, 00:15:00.532 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:00.532 } 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.532 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:00.789 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:00.789 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:00.789 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:15:01.046 [2024-07-24 23:36:45.801226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:01.046 [2024-07-24 23:36:45.801254] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.046 [2024-07-24 23:36:45.801267] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7c970 00:15:01.046 [2024-07-24 23:36:45.801273] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.046 [2024-07-24 23:36:45.802485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.046 [2024-07-24 23:36:45.802506] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:01.046 [2024-07-24 23:36:45.802552] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:01.046 [2024-07-24 23:36:45.802570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:01.046 pt1 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:01.046 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.046 "name": "raid_bdev1", 00:15:01.046 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:15:01.046 "strip_size_kb": 64, 00:15:01.046 "state": "configuring", 00:15:01.046 "raid_level": "raid0", 00:15:01.046 "superblock": true, 00:15:01.046 "num_base_bdevs": 4, 00:15:01.046 "num_base_bdevs_discovered": 1, 00:15:01.046 "num_base_bdevs_operational": 4, 00:15:01.046 "base_bdevs_list": [ 00:15:01.046 { 00:15:01.046 "name": "pt1", 00:15:01.046 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:01.046 "is_configured": true, 00:15:01.046 "data_offset": 2048, 00:15:01.046 "data_size": 63488 00:15:01.046 }, 00:15:01.046 { 00:15:01.046 "name": null, 00:15:01.046 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:01.046 "is_configured": false, 00:15:01.046 "data_offset": 2048, 00:15:01.046 "data_size": 63488 00:15:01.046 }, 00:15:01.046 { 00:15:01.046 "name": null, 00:15:01.046 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:01.046 "is_configured": false, 00:15:01.046 "data_offset": 2048, 00:15:01.046 "data_size": 63488 00:15:01.046 }, 00:15:01.046 { 00:15:01.046 "name": null, 00:15:01.046 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:01.046 "is_configured": false, 00:15:01.047 "data_offset": 2048, 00:15:01.047 "data_size": 63488 00:15:01.047 } 00:15:01.047 ] 00:15:01.047 }' 00:15:01.047 23:36:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.047 23:36:45 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.611 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:15:01.611 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:01.868 [2024-07-24 23:36:46.631372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:01.868 [2024-07-24 23:36:46.631409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.868 [2024-07-24 23:36:46.631419] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb24e0 00:15:01.868 [2024-07-24 23:36:46.631426] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.868 [2024-07-24 23:36:46.631676] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.868 [2024-07-24 23:36:46.631688] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:01.869 [2024-07-24 23:36:46.631730] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:01.869 [2024-07-24 23:36:46.631743] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:01.869 pt2 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:01.869 [2024-07-24 23:36:46.803826] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:01.869 23:36:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.869 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:02.131 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.131 "name": "raid_bdev1", 00:15:02.131 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:15:02.131 "strip_size_kb": 64, 00:15:02.131 "state": "configuring", 00:15:02.131 "raid_level": "raid0", 00:15:02.131 "superblock": true, 00:15:02.131 "num_base_bdevs": 4, 00:15:02.131 "num_base_bdevs_discovered": 1, 00:15:02.131 "num_base_bdevs_operational": 4, 00:15:02.131 "base_bdevs_list": [ 00:15:02.131 { 00:15:02.131 "name": "pt1", 00:15:02.131 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:02.131 "is_configured": true, 00:15:02.131 "data_offset": 2048, 00:15:02.131 "data_size": 63488 00:15:02.131 }, 00:15:02.131 { 00:15:02.131 "name": null, 00:15:02.131 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:02.131 
"is_configured": false, 00:15:02.131 "data_offset": 2048, 00:15:02.131 "data_size": 63488 00:15:02.131 }, 00:15:02.131 { 00:15:02.131 "name": null, 00:15:02.131 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:02.131 "is_configured": false, 00:15:02.131 "data_offset": 2048, 00:15:02.131 "data_size": 63488 00:15:02.131 }, 00:15:02.131 { 00:15:02.131 "name": null, 00:15:02.131 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:02.131 "is_configured": false, 00:15:02.131 "data_offset": 2048, 00:15:02.131 "data_size": 63488 00:15:02.131 } 00:15:02.131 ] 00:15:02.131 }' 00:15:02.131 23:36:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.131 23:36:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:02.696 [2024-07-24 23:36:47.646011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:02.696 [2024-07-24 23:36:47.646043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.696 [2024-07-24 23:36:47.646054] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb2710 00:15:02.696 [2024-07-24 23:36:47.646060] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.696 [2024-07-24 23:36:47.646302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.696 [2024-07-24 23:36:47.646315] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:02.696 [2024-07-24 23:36:47.646358] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:02.696 [2024-07-24 23:36:47.646370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:02.696 pt2 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:02.696 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:02.995 [2024-07-24 23:36:47.818455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:02.995 [2024-07-24 23:36:47.818481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.995 [2024-07-24 23:36:47.818512] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb54f0 00:15:02.995 [2024-07-24 23:36:47.818518] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.995 [2024-07-24 23:36:47.818721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.995 [2024-07-24 23:36:47.818731] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:02.995 [2024-07-24 23:36:47.818766] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:02.995 [2024-07-24 23:36:47.818777] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:02.995 pt3 00:15:02.995 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:02.995 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:02.995 23:36:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:02.995 [2024-07-24 23:36:47.982881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:15:02.995 [2024-07-24 23:36:47.982899] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.995 [2024-07-24 23:36:47.982908] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb6a50 00:15:02.996 [2024-07-24 23:36:47.982913] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.996 [2024-07-24 23:36:47.983096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.996 [2024-07-24 23:36:47.983107] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:02.996 [2024-07-24 23:36:47.983138] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:15:02.996 [2024-07-24 23:36:47.983148] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:02.996 [2024-07-24 23:36:47.983224] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xeb3610 00:15:02.996 [2024-07-24 23:36:47.983230] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:02.996 [2024-07-24 23:36:47.983339] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeb8430 00:15:02.996 [2024-07-24 23:36:47.983424] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeb3610 00:15:02.996 [2024-07-24 23:36:47.983429] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xeb3610 00:15:02.996 [2024-07-24 23:36:47.983497] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:03.267 pt4 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.267 "name": "raid_bdev1", 00:15:03.267 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:15:03.267 "strip_size_kb": 64, 00:15:03.267 "state": "online", 00:15:03.267 "raid_level": "raid0", 00:15:03.267 "superblock": true, 00:15:03.267 "num_base_bdevs": 4, 00:15:03.267 "num_base_bdevs_discovered": 4, 00:15:03.267 "num_base_bdevs_operational": 4, 00:15:03.267 "base_bdevs_list": [ 00:15:03.267 { 00:15:03.267 
"name": "pt1", 00:15:03.267 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.267 "is_configured": true, 00:15:03.267 "data_offset": 2048, 00:15:03.267 "data_size": 63488 00:15:03.267 }, 00:15:03.267 { 00:15:03.267 "name": "pt2", 00:15:03.267 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.267 "is_configured": true, 00:15:03.267 "data_offset": 2048, 00:15:03.267 "data_size": 63488 00:15:03.267 }, 00:15:03.267 { 00:15:03.267 "name": "pt3", 00:15:03.267 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.267 "is_configured": true, 00:15:03.267 "data_offset": 2048, 00:15:03.267 "data_size": 63488 00:15:03.267 }, 00:15:03.267 { 00:15:03.267 "name": "pt4", 00:15:03.267 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:03.267 "is_configured": true, 00:15:03.267 "data_offset": 2048, 00:15:03.267 "data_size": 63488 00:15:03.267 } 00:15:03.267 ] 00:15:03.267 }' 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.267 23:36:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:03.832 [2024-07-24 23:36:48.793187] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:03.832 "name": "raid_bdev1", 00:15:03.832 "aliases": [ 00:15:03.832 "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8" 00:15:03.832 ], 00:15:03.832 "product_name": "Raid Volume", 00:15:03.832 "block_size": 512, 00:15:03.832 "num_blocks": 253952, 00:15:03.832 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:15:03.832 "assigned_rate_limits": { 00:15:03.832 "rw_ios_per_sec": 0, 00:15:03.832 "rw_mbytes_per_sec": 0, 00:15:03.832 "r_mbytes_per_sec": 0, 00:15:03.832 "w_mbytes_per_sec": 0 00:15:03.832 }, 00:15:03.832 "claimed": false, 00:15:03.832 "zoned": false, 00:15:03.832 "supported_io_types": { 00:15:03.832 "read": true, 00:15:03.832 "write": true, 00:15:03.832 "unmap": true, 00:15:03.832 "flush": true, 00:15:03.832 "reset": true, 00:15:03.832 "nvme_admin": false, 00:15:03.832 "nvme_io": false, 00:15:03.832 "nvme_io_md": false, 00:15:03.832 "write_zeroes": true, 00:15:03.832 "zcopy": false, 00:15:03.832 "get_zone_info": false, 00:15:03.832 "zone_management": false, 00:15:03.832 "zone_append": false, 00:15:03.832 "compare": false, 00:15:03.832 "compare_and_write": false, 00:15:03.832 "abort": false, 00:15:03.832 "seek_hole": false, 00:15:03.832 "seek_data": false, 00:15:03.832 "copy": false, 00:15:03.832 "nvme_iov_md": false 00:15:03.832 }, 00:15:03.832 "memory_domains": [ 00:15:03.832 { 00:15:03.832 "dma_device_id": "system", 00:15:03.832 "dma_device_type": 1 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.832 "dma_device_type": 2 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "system", 00:15:03.832 "dma_device_type": 1 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.832 "dma_device_type": 2 00:15:03.832 }, 
00:15:03.832 { 00:15:03.832 "dma_device_id": "system", 00:15:03.832 "dma_device_type": 1 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.832 "dma_device_type": 2 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "system", 00:15:03.832 "dma_device_type": 1 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.832 "dma_device_type": 2 00:15:03.832 } 00:15:03.832 ], 00:15:03.832 "driver_specific": { 00:15:03.832 "raid": { 00:15:03.832 "uuid": "c1c40ed4-9171-4af9-a9a7-70bad9a29ee8", 00:15:03.832 "strip_size_kb": 64, 00:15:03.832 "state": "online", 00:15:03.832 "raid_level": "raid0", 00:15:03.832 "superblock": true, 00:15:03.832 "num_base_bdevs": 4, 00:15:03.832 "num_base_bdevs_discovered": 4, 00:15:03.832 "num_base_bdevs_operational": 4, 00:15:03.832 "base_bdevs_list": [ 00:15:03.832 { 00:15:03.832 "name": "pt1", 00:15:03.832 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.832 "is_configured": true, 00:15:03.832 "data_offset": 2048, 00:15:03.832 "data_size": 63488 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "name": "pt2", 00:15:03.832 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.832 "is_configured": true, 00:15:03.832 "data_offset": 2048, 00:15:03.832 "data_size": 63488 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "name": "pt3", 00:15:03.832 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.832 "is_configured": true, 00:15:03.832 "data_offset": 2048, 00:15:03.832 "data_size": 63488 00:15:03.832 }, 00:15:03.832 { 00:15:03.832 "name": "pt4", 00:15:03.832 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:03.832 "is_configured": true, 00:15:03.832 "data_offset": 2048, 00:15:03.832 "data_size": 63488 00:15:03.832 } 00:15:03.832 ] 00:15:03.832 } 00:15:03.832 } 00:15:03.832 }' 00:15:03.832 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:15:04.091 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:04.091 pt2 00:15:04.091 pt3 00:15:04.091 pt4' 00:15:04.091 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.091 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:04.091 23:36:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.091 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.091 "name": "pt1", 00:15:04.091 "aliases": [ 00:15:04.091 "00000000-0000-0000-0000-000000000001" 00:15:04.091 ], 00:15:04.091 "product_name": "passthru", 00:15:04.091 "block_size": 512, 00:15:04.091 "num_blocks": 65536, 00:15:04.091 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:04.091 "assigned_rate_limits": { 00:15:04.091 "rw_ios_per_sec": 0, 00:15:04.091 "rw_mbytes_per_sec": 0, 00:15:04.091 "r_mbytes_per_sec": 0, 00:15:04.091 "w_mbytes_per_sec": 0 00:15:04.091 }, 00:15:04.091 "claimed": true, 00:15:04.091 "claim_type": "exclusive_write", 00:15:04.091 "zoned": false, 00:15:04.091 "supported_io_types": { 00:15:04.091 "read": true, 00:15:04.091 "write": true, 00:15:04.091 "unmap": true, 00:15:04.091 "flush": true, 00:15:04.091 "reset": true, 00:15:04.091 "nvme_admin": false, 00:15:04.091 "nvme_io": false, 00:15:04.091 "nvme_io_md": false, 00:15:04.091 "write_zeroes": true, 00:15:04.091 "zcopy": true, 00:15:04.091 "get_zone_info": false, 00:15:04.091 "zone_management": false, 00:15:04.091 "zone_append": false, 00:15:04.091 "compare": false, 00:15:04.091 "compare_and_write": false, 00:15:04.091 "abort": true, 00:15:04.091 "seek_hole": false, 00:15:04.091 "seek_data": false, 00:15:04.091 "copy": true, 00:15:04.091 "nvme_iov_md": false 00:15:04.091 }, 00:15:04.091 "memory_domains": [ 00:15:04.091 { 
00:15:04.091 "dma_device_id": "system", 00:15:04.091 "dma_device_type": 1 00:15:04.091 }, 00:15:04.091 { 00:15:04.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.091 "dma_device_type": 2 00:15:04.091 } 00:15:04.091 ], 00:15:04.091 "driver_specific": { 00:15:04.091 "passthru": { 00:15:04.091 "name": "pt1", 00:15:04.091 "base_bdev_name": "malloc1" 00:15:04.091 } 00:15:04.091 } 00:15:04.091 }' 00:15:04.091 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.091 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:04.349 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.607 
23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.607 "name": "pt2", 00:15:04.607 "aliases": [ 00:15:04.607 "00000000-0000-0000-0000-000000000002" 00:15:04.607 ], 00:15:04.607 "product_name": "passthru", 00:15:04.607 "block_size": 512, 00:15:04.607 "num_blocks": 65536, 00:15:04.607 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:04.607 "assigned_rate_limits": { 00:15:04.607 "rw_ios_per_sec": 0, 00:15:04.607 "rw_mbytes_per_sec": 0, 00:15:04.607 "r_mbytes_per_sec": 0, 00:15:04.607 "w_mbytes_per_sec": 0 00:15:04.607 }, 00:15:04.607 "claimed": true, 00:15:04.607 "claim_type": "exclusive_write", 00:15:04.607 "zoned": false, 00:15:04.607 "supported_io_types": { 00:15:04.607 "read": true, 00:15:04.607 "write": true, 00:15:04.607 "unmap": true, 00:15:04.607 "flush": true, 00:15:04.607 "reset": true, 00:15:04.607 "nvme_admin": false, 00:15:04.607 "nvme_io": false, 00:15:04.607 "nvme_io_md": false, 00:15:04.607 "write_zeroes": true, 00:15:04.607 "zcopy": true, 00:15:04.607 "get_zone_info": false, 00:15:04.607 "zone_management": false, 00:15:04.607 "zone_append": false, 00:15:04.607 "compare": false, 00:15:04.607 "compare_and_write": false, 00:15:04.607 "abort": true, 00:15:04.607 "seek_hole": false, 00:15:04.607 "seek_data": false, 00:15:04.607 "copy": true, 00:15:04.607 "nvme_iov_md": false 00:15:04.607 }, 00:15:04.607 "memory_domains": [ 00:15:04.607 { 00:15:04.607 "dma_device_id": "system", 00:15:04.607 "dma_device_type": 1 00:15:04.607 }, 00:15:04.607 { 00:15:04.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.607 "dma_device_type": 2 00:15:04.607 } 00:15:04.607 ], 00:15:04.607 "driver_specific": { 00:15:04.607 "passthru": { 00:15:04.607 "name": "pt2", 00:15:04.607 "base_bdev_name": "malloc2" 00:15:04.607 } 00:15:04.607 } 00:15:04.607 }' 00:15:04.607 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.607 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:15:04.607 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.607 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:04.865 23:36:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.122 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.122 "name": "pt3", 00:15:05.122 "aliases": [ 00:15:05.122 "00000000-0000-0000-0000-000000000003" 00:15:05.122 ], 00:15:05.122 "product_name": "passthru", 00:15:05.122 "block_size": 512, 00:15:05.122 "num_blocks": 65536, 00:15:05.122 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:05.122 "assigned_rate_limits": { 00:15:05.122 "rw_ios_per_sec": 0, 00:15:05.122 "rw_mbytes_per_sec": 0, 00:15:05.122 "r_mbytes_per_sec": 0, 00:15:05.122 "w_mbytes_per_sec": 0 00:15:05.122 }, 
00:15:05.123 "claimed": true, 00:15:05.123 "claim_type": "exclusive_write", 00:15:05.123 "zoned": false, 00:15:05.123 "supported_io_types": { 00:15:05.123 "read": true, 00:15:05.123 "write": true, 00:15:05.123 "unmap": true, 00:15:05.123 "flush": true, 00:15:05.123 "reset": true, 00:15:05.123 "nvme_admin": false, 00:15:05.123 "nvme_io": false, 00:15:05.123 "nvme_io_md": false, 00:15:05.123 "write_zeroes": true, 00:15:05.123 "zcopy": true, 00:15:05.123 "get_zone_info": false, 00:15:05.123 "zone_management": false, 00:15:05.123 "zone_append": false, 00:15:05.123 "compare": false, 00:15:05.123 "compare_and_write": false, 00:15:05.123 "abort": true, 00:15:05.123 "seek_hole": false, 00:15:05.123 "seek_data": false, 00:15:05.123 "copy": true, 00:15:05.123 "nvme_iov_md": false 00:15:05.123 }, 00:15:05.123 "memory_domains": [ 00:15:05.123 { 00:15:05.123 "dma_device_id": "system", 00:15:05.123 "dma_device_type": 1 00:15:05.123 }, 00:15:05.123 { 00:15:05.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.123 "dma_device_type": 2 00:15:05.123 } 00:15:05.123 ], 00:15:05.123 "driver_specific": { 00:15:05.123 "passthru": { 00:15:05.123 "name": "pt3", 00:15:05.123 "base_bdev_name": "malloc3" 00:15:05.123 } 00:15:05.123 } 00:15:05.123 }' 00:15:05.123 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.123 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.123 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.123 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.381 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.639 "name": "pt4", 00:15:05.639 "aliases": [ 00:15:05.639 "00000000-0000-0000-0000-000000000004" 00:15:05.639 ], 00:15:05.639 "product_name": "passthru", 00:15:05.639 "block_size": 512, 00:15:05.639 "num_blocks": 65536, 00:15:05.639 "uuid": "00000000-0000-0000-0000-000000000004", 00:15:05.639 "assigned_rate_limits": { 00:15:05.639 "rw_ios_per_sec": 0, 00:15:05.639 "rw_mbytes_per_sec": 0, 00:15:05.639 "r_mbytes_per_sec": 0, 00:15:05.639 "w_mbytes_per_sec": 0 00:15:05.639 }, 00:15:05.639 "claimed": true, 00:15:05.639 "claim_type": "exclusive_write", 00:15:05.639 "zoned": false, 00:15:05.639 "supported_io_types": { 00:15:05.639 "read": true, 00:15:05.639 "write": true, 00:15:05.639 "unmap": true, 00:15:05.639 "flush": true, 00:15:05.639 "reset": true, 00:15:05.639 "nvme_admin": false, 00:15:05.639 "nvme_io": false, 00:15:05.639 "nvme_io_md": false, 00:15:05.639 "write_zeroes": true, 00:15:05.639 "zcopy": true, 00:15:05.639 "get_zone_info": false, 00:15:05.639 "zone_management": false, 00:15:05.639 "zone_append": false, 00:15:05.639 
"compare": false, 00:15:05.639 "compare_and_write": false, 00:15:05.639 "abort": true, 00:15:05.639 "seek_hole": false, 00:15:05.639 "seek_data": false, 00:15:05.639 "copy": true, 00:15:05.639 "nvme_iov_md": false 00:15:05.639 }, 00:15:05.639 "memory_domains": [ 00:15:05.639 { 00:15:05.639 "dma_device_id": "system", 00:15:05.639 "dma_device_type": 1 00:15:05.639 }, 00:15:05.639 { 00:15:05.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.639 "dma_device_type": 2 00:15:05.639 } 00:15:05.639 ], 00:15:05.639 "driver_specific": { 00:15:05.639 "passthru": { 00:15:05.639 "name": "pt4", 00:15:05.639 "base_bdev_name": "malloc4" 00:15:05.639 } 00:15:05.639 } 00:15:05.639 }' 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.639 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:05.897 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:06.155 [2024-07-24 23:36:50.966818] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c1c40ed4-9171-4af9-a9a7-70bad9a29ee8 '!=' c1c40ed4-9171-4af9-a9a7-70bad9a29ee8 ']' 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 311904 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 311904 ']' 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 311904 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:06.155 23:36:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 311904 00:15:06.155 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:06.155 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:06.155 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 311904' 00:15:06.155 killing process with pid 311904 00:15:06.155 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 311904 00:15:06.155 [2024-07-24 
23:36:51.026120] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:06.155 [2024-07-24 23:36:51.026169] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.155 [2024-07-24 23:36:51.026213] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.155 [2024-07-24 23:36:51.026219] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeb3610 name raid_bdev1, state offline 00:15:06.155 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 311904 00:15:06.155 [2024-07-24 23:36:51.057650] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:06.413 23:36:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:06.413 00:15:06.413 real 0m12.313s 00:15:06.413 user 0m22.554s 00:15:06.413 sys 0m1.873s 00:15:06.413 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:06.413 23:36:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.413 ************************************ 00:15:06.413 END TEST raid_superblock_test 00:15:06.413 ************************************ 00:15:06.413 23:36:51 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:15:06.413 23:36:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:06.413 23:36:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:06.413 23:36:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:06.413 ************************************ 00:15:06.413 START TEST raid_read_error_test 00:15:06.413 ************************************ 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 read 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:06.413 
23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:06.413 23:36:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.eq1cySTVlj 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=314296 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 314296 /var/tmp/spdk-raid.sock 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 314296 ']' 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:06.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:06.413 23:36:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.413 [2024-07-24 23:36:51.364221] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:15:06.413 [2024-07-24 23:36:51.364260] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314296 ] 00:15:06.671 [2024-07-24 23:36:51.429029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:06.671 [2024-07-24 23:36:51.506603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:06.671 [2024-07-24 23:36:51.559884] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.671 [2024-07-24 23:36:51.559907] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.255 23:36:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:07.255 23:36:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:07.255 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:07.255 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:07.513 BaseBdev1_malloc 00:15:07.513 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:07.514 true 00:15:07.514 23:36:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:07.772 [2024-07-24 23:36:52.635787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:07.772 [2024-07-24 23:36:52.635823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.772 [2024-07-24 23:36:52.635832] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1274550 00:15:07.772 [2024-07-24 23:36:52.635837] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.772 [2024-07-24 23:36:52.636997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.772 [2024-07-24 23:36:52.637017] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:07.772 BaseBdev1 00:15:07.772 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:07.772 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:08.030 BaseBdev2_malloc 00:15:08.030 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:08.030 true 00:15:08.030 23:36:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:08.288 [2024-07-24 23:36:53.152393] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:08.288 [2024-07-24 23:36:53.152424] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.288 [2024-07-24 23:36:53.152434] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1278d90 00:15:08.288 [2024-07-24 23:36:53.152440] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.288 [2024-07-24 23:36:53.153508] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.288 [2024-07-24 23:36:53.153528] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:08.288 BaseBdev2 00:15:08.288 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:08.288 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:08.550 BaseBdev3_malloc 00:15:08.550 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:08.550 true 00:15:08.550 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:08.812 [2024-07-24 23:36:53.665213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:08.812 [2024-07-24 23:36:53.665248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.812 [2024-07-24 23:36:53.665258] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127b050 00:15:08.812 [2024-07-24 23:36:53.665264] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.813 [2024-07-24 23:36:53.666263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:15:08.813 [2024-07-24 23:36:53.666285] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:08.813 BaseBdev3 00:15:08.813 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:08.813 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:09.070 BaseBdev4_malloc 00:15:09.070 23:36:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:09.070 true 00:15:09.070 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:09.328 [2024-07-24 23:36:54.177957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:09.328 [2024-07-24 23:36:54.177987] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.328 [2024-07-24 23:36:54.177997] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127bf20 00:15:09.328 [2024-07-24 23:36:54.178002] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.328 [2024-07-24 23:36:54.178920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:09.328 [2024-07-24 23:36:54.178939] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:09.328 BaseBdev4 00:15:09.328 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:15:09.587 [2024-07-24 23:36:54.350436] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:09.587 [2024-07-24 23:36:54.351284] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:09.587 [2024-07-24 23:36:54.351330] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:09.587 [2024-07-24 23:36:54.351370] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:09.587 [2024-07-24 23:36:54.351534] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12760a0 00:15:09.587 [2024-07-24 23:36:54.351543] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:09.587 [2024-07-24 23:36:54.351667] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10ca6e0 00:15:09.587 [2024-07-24 23:36:54.351780] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12760a0 00:15:09.587 [2024-07-24 23:36:54.351785] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12760a0 00:15:09.587 [2024-07-24 23:36:54.351848] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.587 "name": "raid_bdev1", 00:15:09.587 "uuid": "ac47ab63-a8be-403b-b7b2-32b7467fce8b", 00:15:09.587 "strip_size_kb": 64, 00:15:09.587 "state": "online", 00:15:09.587 "raid_level": "raid0", 00:15:09.587 "superblock": true, 00:15:09.587 "num_base_bdevs": 4, 00:15:09.587 "num_base_bdevs_discovered": 4, 00:15:09.587 "num_base_bdevs_operational": 4, 00:15:09.587 "base_bdevs_list": [ 00:15:09.587 { 00:15:09.587 "name": "BaseBdev1", 00:15:09.587 "uuid": "a35dfd68-f35a-53b7-acf0-9222df8e8e7a", 00:15:09.587 "is_configured": true, 00:15:09.587 "data_offset": 2048, 00:15:09.587 "data_size": 63488 00:15:09.587 }, 00:15:09.587 { 00:15:09.587 "name": "BaseBdev2", 00:15:09.587 "uuid": "3dbee596-bfdd-5e18-85a1-91480f853889", 00:15:09.587 "is_configured": true, 00:15:09.587 "data_offset": 2048, 00:15:09.587 "data_size": 63488 00:15:09.587 }, 00:15:09.587 { 00:15:09.587 "name": "BaseBdev3", 00:15:09.587 "uuid": "abb9eb06-4088-56fe-8623-a7f6785c201d", 00:15:09.587 "is_configured": true, 00:15:09.587 "data_offset": 2048, 00:15:09.587 "data_size": 63488 00:15:09.587 }, 00:15:09.587 { 00:15:09.587 "name": "BaseBdev4", 00:15:09.587 "uuid": "701cb925-b7a6-5c4c-b9b8-0e05599c2499", 00:15:09.587 
"is_configured": true, 00:15:09.587 "data_offset": 2048, 00:15:09.587 "data_size": 63488 00:15:09.587 } 00:15:09.587 ] 00:15:09.587 }' 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.587 23:36:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.152 23:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:10.152 23:36:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:10.152 [2024-07-24 23:36:55.092581] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1268440 00:15:11.085 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.344 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:11.602 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.602 "name": "raid_bdev1", 00:15:11.602 "uuid": "ac47ab63-a8be-403b-b7b2-32b7467fce8b", 00:15:11.602 "strip_size_kb": 64, 00:15:11.602 "state": "online", 00:15:11.602 "raid_level": "raid0", 00:15:11.602 "superblock": true, 00:15:11.602 "num_base_bdevs": 4, 00:15:11.602 "num_base_bdevs_discovered": 4, 00:15:11.602 "num_base_bdevs_operational": 4, 00:15:11.602 "base_bdevs_list": [ 00:15:11.602 { 00:15:11.602 "name": "BaseBdev1", 00:15:11.602 "uuid": "a35dfd68-f35a-53b7-acf0-9222df8e8e7a", 00:15:11.602 "is_configured": true, 00:15:11.602 "data_offset": 2048, 00:15:11.602 "data_size": 63488 00:15:11.602 }, 00:15:11.602 { 00:15:11.602 "name": "BaseBdev2", 00:15:11.602 "uuid": "3dbee596-bfdd-5e18-85a1-91480f853889", 00:15:11.602 "is_configured": true, 00:15:11.602 "data_offset": 2048, 00:15:11.602 "data_size": 63488 00:15:11.602 }, 00:15:11.602 { 00:15:11.602 "name": "BaseBdev3", 00:15:11.602 "uuid": "abb9eb06-4088-56fe-8623-a7f6785c201d", 00:15:11.602 "is_configured": true, 00:15:11.602 "data_offset": 2048, 00:15:11.602 "data_size": 63488 00:15:11.602 }, 00:15:11.602 { 00:15:11.602 "name": "BaseBdev4", 00:15:11.602 "uuid": 
"701cb925-b7a6-5c4c-b9b8-0e05599c2499", 00:15:11.602 "is_configured": true, 00:15:11.602 "data_offset": 2048, 00:15:11.602 "data_size": 63488 00:15:11.602 } 00:15:11.602 ] 00:15:11.602 }' 00:15:11.602 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.602 23:36:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.861 23:36:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:12.120 [2024-07-24 23:36:57.001546] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:12.120 [2024-07-24 23:36:57.001579] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:12.120 [2024-07-24 23:36:57.003667] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:12.120 [2024-07-24 23:36:57.003692] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.120 [2024-07-24 23:36:57.003718] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:12.120 [2024-07-24 23:36:57.003723] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12760a0 name raid_bdev1, state offline 00:15:12.120 0 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 314296 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 314296 ']' 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 314296 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps 
--no-headers -o comm= 314296 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 314296' 00:15:12.120 killing process with pid 314296 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 314296 00:15:12.120 [2024-07-24 23:36:57.053510] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:12.120 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 314296 00:15:12.120 [2024-07-24 23:36:57.079181] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.eq1cySTVlj 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:15:12.378 00:15:12.378 real 0m5.971s 00:15:12.378 user 0m9.363s 00:15:12.378 sys 0m0.897s 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:12.378 23:36:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.378 
************************************ 00:15:12.378 END TEST raid_read_error_test 00:15:12.378 ************************************ 00:15:12.378 23:36:57 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:15:12.378 23:36:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:12.378 23:36:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:12.378 23:36:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:12.378 ************************************ 00:15:12.378 START TEST raid_write_error_test 00:15:12.378 ************************************ 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 write 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
echo BaseBdev3 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.EVDkk2bmZn 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:12.378 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=315503 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 315503 /var/tmp/spdk-raid.sock 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 315503 ']' 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:12.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:12.379 23:36:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.637 [2024-07-24 23:36:57.379882] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:15:12.637 [2024-07-24 23:36:57.379922] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid315503 ] 00:15:12.637 [2024-07-24 23:36:57.437471] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:12.637 [2024-07-24 23:36:57.515899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:12.637 [2024-07-24 23:36:57.564950] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:12.637 [2024-07-24 23:36:57.564975] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:13.203 23:36:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:13.203 23:36:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:13.203 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:13.203 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:13.461 BaseBdev1_malloc 00:15:13.461 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:13.720 true 00:15:13.720 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:13.720 [2024-07-24 23:36:58.672770] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:13.720 [2024-07-24 23:36:58.672803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:15:13.720 [2024-07-24 23:36:58.672815] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1656550 00:15:13.720 [2024-07-24 23:36:58.672820] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:13.720 [2024-07-24 23:36:58.674084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:13.720 [2024-07-24 23:36:58.674107] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:13.720 BaseBdev1 00:15:13.720 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:13.720 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:13.978 BaseBdev2_malloc 00:15:13.978 23:36:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:14.237 true 00:15:14.237 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:14.237 [2024-07-24 23:36:59.177512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:14.237 [2024-07-24 23:36:59.177543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:14.237 [2024-07-24 23:36:59.177555] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165ad90 00:15:14.237 [2024-07-24 23:36:59.177561] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:14.237 [2024-07-24 23:36:59.178596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:14.237 [2024-07-24 23:36:59.178617] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:14.237 BaseBdev2 00:15:14.237 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:14.237 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:14.495 BaseBdev3_malloc 00:15:14.495 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:14.763 true 00:15:14.763 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:14.763 [2024-07-24 23:36:59.666321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:14.763 [2024-07-24 23:36:59.666351] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:14.763 [2024-07-24 23:36:59.666363] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165d050 00:15:14.763 [2024-07-24 23:36:59.666369] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:14.763 [2024-07-24 23:36:59.667398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:14.763 [2024-07-24 23:36:59.667418] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:14.763 BaseBdev3 00:15:14.763 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:14.763 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:15:15.026 BaseBdev4_malloc 00:15:15.026 23:36:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:15:15.026 true 00:15:15.026 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:15:15.285 [2024-07-24 23:37:00.175250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:15:15.285 [2024-07-24 23:37:00.175287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:15.285 [2024-07-24 23:37:00.175301] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x165df20 00:15:15.285 [2024-07-24 23:37:00.175307] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:15.285 [2024-07-24 23:37:00.176405] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:15.285 [2024-07-24 23:37:00.176425] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:15:15.285 BaseBdev4 00:15:15.285 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:15:15.544 [2024-07-24 23:37:00.331689] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:15.544 [2024-07-24 23:37:00.332554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:15.544 [2024-07-24 23:37:00.332601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:15.544 [2024-07-24 23:37:00.332640] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:15.544 [2024-07-24 23:37:00.332791] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x16580a0 00:15:15.544 [2024-07-24 23:37:00.332797] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:15.544 [2024-07-24 23:37:00.332926] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14ac6e0 00:15:15.544 [2024-07-24 23:37:00.333028] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16580a0 00:15:15.544 [2024-07-24 23:37:00.333034] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16580a0 00:15:15.544 [2024-07-24 23:37:00.333104] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.544 23:37:00 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:15.544 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.544 "name": "raid_bdev1", 00:15:15.544 "uuid": "af04380a-65cf-4d5f-8e8c-8a9deca5ed63", 00:15:15.544 "strip_size_kb": 64, 00:15:15.544 "state": "online", 00:15:15.544 "raid_level": "raid0", 00:15:15.544 "superblock": true, 00:15:15.544 "num_base_bdevs": 4, 00:15:15.544 "num_base_bdevs_discovered": 4, 00:15:15.544 "num_base_bdevs_operational": 4, 00:15:15.544 "base_bdevs_list": [ 00:15:15.544 { 00:15:15.544 "name": "BaseBdev1", 00:15:15.544 "uuid": "1d281d5a-0860-5a59-9261-1ac55865fd71", 00:15:15.544 "is_configured": true, 00:15:15.544 "data_offset": 2048, 00:15:15.544 "data_size": 63488 00:15:15.544 }, 00:15:15.544 { 00:15:15.544 "name": "BaseBdev2", 00:15:15.544 "uuid": "8aa5ff51-2758-5995-b4c1-4122119f70ce", 00:15:15.544 "is_configured": true, 00:15:15.544 "data_offset": 2048, 00:15:15.544 "data_size": 63488 00:15:15.544 }, 00:15:15.544 { 00:15:15.544 "name": "BaseBdev3", 00:15:15.544 "uuid": "21ba75ea-e03d-5da4-9705-85a30b725c42", 00:15:15.544 "is_configured": true, 00:15:15.545 "data_offset": 2048, 00:15:15.545 "data_size": 63488 00:15:15.545 }, 00:15:15.545 { 00:15:15.545 "name": "BaseBdev4", 00:15:15.545 "uuid": "f82dcfea-c931-575c-96e8-5a9e91365a81", 00:15:15.545 "is_configured": true, 00:15:15.545 "data_offset": 2048, 00:15:15.545 "data_size": 63488 00:15:15.545 } 00:15:15.545 ] 00:15:15.545 }' 00:15:15.545 23:37:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.545 23:37:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.112 23:37:01 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:15:16.112 23:37:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:16.112 [2024-07-24 23:37:01.081827] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x164a440 00:15:17.048 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.307 23:37:02 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.307 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:17.566 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.566 "name": "raid_bdev1", 00:15:17.566 "uuid": "af04380a-65cf-4d5f-8e8c-8a9deca5ed63", 00:15:17.566 "strip_size_kb": 64, 00:15:17.566 "state": "online", 00:15:17.566 "raid_level": "raid0", 00:15:17.566 "superblock": true, 00:15:17.566 "num_base_bdevs": 4, 00:15:17.566 "num_base_bdevs_discovered": 4, 00:15:17.566 "num_base_bdevs_operational": 4, 00:15:17.566 "base_bdevs_list": [ 00:15:17.566 { 00:15:17.566 "name": "BaseBdev1", 00:15:17.566 "uuid": "1d281d5a-0860-5a59-9261-1ac55865fd71", 00:15:17.566 "is_configured": true, 00:15:17.567 "data_offset": 2048, 00:15:17.567 "data_size": 63488 00:15:17.567 }, 00:15:17.567 { 00:15:17.567 "name": "BaseBdev2", 00:15:17.567 "uuid": "8aa5ff51-2758-5995-b4c1-4122119f70ce", 00:15:17.567 "is_configured": true, 00:15:17.567 "data_offset": 2048, 00:15:17.567 "data_size": 63488 00:15:17.567 }, 00:15:17.567 { 00:15:17.567 "name": "BaseBdev3", 00:15:17.567 "uuid": "21ba75ea-e03d-5da4-9705-85a30b725c42", 00:15:17.567 "is_configured": true, 00:15:17.567 "data_offset": 2048, 00:15:17.567 "data_size": 63488 00:15:17.567 }, 00:15:17.567 { 00:15:17.567 "name": "BaseBdev4", 00:15:17.567 "uuid": "f82dcfea-c931-575c-96e8-5a9e91365a81", 00:15:17.567 "is_configured": true, 00:15:17.567 "data_offset": 2048, 00:15:17.567 "data_size": 63488 00:15:17.567 } 00:15:17.567 ] 00:15:17.567 }' 00:15:17.567 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.567 23:37:02 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:15:18.133 23:37:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:18.133 [2024-07-24 23:37:03.002512] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:18.133 [2024-07-24 23:37:03.002540] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:18.133 [2024-07-24 23:37:03.004575] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:18.134 [2024-07-24 23:37:03.004601] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:18.134 [2024-07-24 23:37:03.004627] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:18.134 [2024-07-24 23:37:03.004632] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16580a0 name raid_bdev1, state offline 00:15:18.134 0 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 315503 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 315503 ']' 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 315503 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 315503 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 315503' 00:15:18.134 killing process with pid 315503 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 315503 00:15:18.134 [2024-07-24 23:37:03.059065] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:18.134 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 315503 00:15:18.134 [2024-07-24 23:37:03.085180] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.EVDkk2bmZn 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:18.392 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:18.393 23:37:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:18.393 00:15:18.393 real 0m5.942s 00:15:18.393 user 0m9.377s 00:15:18.393 sys 0m0.848s 00:15:18.393 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:18.393 23:37:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.393 ************************************ 00:15:18.393 END TEST raid_write_error_test 00:15:18.393 ************************************ 00:15:18.393 23:37:03 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:18.393 23:37:03 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test 
raid_state_function_test concat 4 false 00:15:18.393 23:37:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:18.393 23:37:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:18.393 23:37:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:18.393 ************************************ 00:15:18.393 START TEST raid_state_function_test 00:15:18.393 ************************************ 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 false 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:18.393 23:37:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=316543 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 316543' 00:15:18.393 Process 
raid pid: 316543 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 316543 /var/tmp/spdk-raid.sock 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 316543 ']' 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:18.393 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.393 23:37:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:18.393 [2024-07-24 23:37:03.384538] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:15:18.393 [2024-07-24 23:37:03.384577] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:18.655 [2024-07-24 23:37:03.447443] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:18.655 [2024-07-24 23:37:03.525249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:18.655 [2024-07-24 23:37:03.575382] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:18.655 [2024-07-24 23:37:03.575405] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:19.302 23:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:19.302 23:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:15:19.302 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:19.614 [2024-07-24 23:37:04.310059] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:19.614 [2024-07-24 23:37:04.310088] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:19.614 [2024-07-24 23:37:04.310096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:19.614 [2024-07-24 23:37:04.310102] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:19.614 [2024-07-24 23:37:04.310106] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:19.614 [2024-07-24 23:37:04.310111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:15:19.614 [2024-07-24 23:37:04.310115] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:19.614 [2024-07-24 23:37:04.310120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.614 "name": "Existed_Raid", 00:15:19.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.614 "strip_size_kb": 64, 
00:15:19.614 "state": "configuring", 00:15:19.614 "raid_level": "concat", 00:15:19.614 "superblock": false, 00:15:19.614 "num_base_bdevs": 4, 00:15:19.614 "num_base_bdevs_discovered": 0, 00:15:19.614 "num_base_bdevs_operational": 4, 00:15:19.614 "base_bdevs_list": [ 00:15:19.614 { 00:15:19.614 "name": "BaseBdev1", 00:15:19.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.614 "is_configured": false, 00:15:19.614 "data_offset": 0, 00:15:19.614 "data_size": 0 00:15:19.614 }, 00:15:19.614 { 00:15:19.614 "name": "BaseBdev2", 00:15:19.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.614 "is_configured": false, 00:15:19.614 "data_offset": 0, 00:15:19.614 "data_size": 0 00:15:19.614 }, 00:15:19.614 { 00:15:19.614 "name": "BaseBdev3", 00:15:19.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.614 "is_configured": false, 00:15:19.614 "data_offset": 0, 00:15:19.614 "data_size": 0 00:15:19.614 }, 00:15:19.614 { 00:15:19.614 "name": "BaseBdev4", 00:15:19.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.614 "is_configured": false, 00:15:19.614 "data_offset": 0, 00:15:19.614 "data_size": 0 00:15:19.614 } 00:15:19.614 ] 00:15:19.614 }' 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.614 23:37:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.181 23:37:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:20.181 [2024-07-24 23:37:05.132100] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:20.181 [2024-07-24 23:37:05.132121] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c7b50 name Existed_Raid, state configuring 00:15:20.181 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:20.439 [2024-07-24 23:37:05.288527] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:20.439 [2024-07-24 23:37:05.288547] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:20.439 [2024-07-24 23:37:05.288551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:20.439 [2024-07-24 23:37:05.288556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:20.439 [2024-07-24 23:37:05.288560] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:20.439 [2024-07-24 23:37:05.288565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:20.439 [2024-07-24 23:37:05.288569] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:20.439 [2024-07-24 23:37:05.288574] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:20.439 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:20.697 [2024-07-24 23:37:05.453157] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:20.697 BaseBdev1 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.697 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:20.956 [ 00:15:20.956 { 00:15:20.956 "name": "BaseBdev1", 00:15:20.956 "aliases": [ 00:15:20.956 "865f1c55-b9f4-4c43-8ff8-5bdf217cab33" 00:15:20.956 ], 00:15:20.956 "product_name": "Malloc disk", 00:15:20.956 "block_size": 512, 00:15:20.956 "num_blocks": 65536, 00:15:20.956 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:20.956 "assigned_rate_limits": { 00:15:20.956 "rw_ios_per_sec": 0, 00:15:20.956 "rw_mbytes_per_sec": 0, 00:15:20.956 "r_mbytes_per_sec": 0, 00:15:20.956 "w_mbytes_per_sec": 0 00:15:20.957 }, 00:15:20.957 "claimed": true, 00:15:20.957 "claim_type": "exclusive_write", 00:15:20.957 "zoned": false, 00:15:20.957 "supported_io_types": { 00:15:20.957 "read": true, 00:15:20.957 "write": true, 00:15:20.957 "unmap": true, 00:15:20.957 "flush": true, 00:15:20.957 "reset": true, 00:15:20.957 "nvme_admin": false, 00:15:20.957 "nvme_io": false, 00:15:20.957 "nvme_io_md": false, 00:15:20.957 "write_zeroes": true, 00:15:20.957 "zcopy": true, 00:15:20.957 "get_zone_info": false, 00:15:20.957 "zone_management": false, 00:15:20.957 "zone_append": false, 00:15:20.957 "compare": false, 00:15:20.957 "compare_and_write": false, 00:15:20.957 "abort": true, 00:15:20.957 "seek_hole": false, 00:15:20.957 "seek_data": false, 00:15:20.957 "copy": true, 00:15:20.957 "nvme_iov_md": 
false 00:15:20.957 }, 00:15:20.957 "memory_domains": [ 00:15:20.957 { 00:15:20.957 "dma_device_id": "system", 00:15:20.957 "dma_device_type": 1 00:15:20.957 }, 00:15:20.957 { 00:15:20.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.957 "dma_device_type": 2 00:15:20.957 } 00:15:20.957 ], 00:15:20.957 "driver_specific": {} 00:15:20.957 } 00:15:20.957 ] 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.957 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.215 23:37:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.215 "name": "Existed_Raid", 00:15:21.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.215 "strip_size_kb": 64, 00:15:21.215 "state": "configuring", 00:15:21.215 "raid_level": "concat", 00:15:21.215 "superblock": false, 00:15:21.215 "num_base_bdevs": 4, 00:15:21.215 "num_base_bdevs_discovered": 1, 00:15:21.215 "num_base_bdevs_operational": 4, 00:15:21.215 "base_bdevs_list": [ 00:15:21.215 { 00:15:21.215 "name": "BaseBdev1", 00:15:21.215 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:21.215 "is_configured": true, 00:15:21.215 "data_offset": 0, 00:15:21.215 "data_size": 65536 00:15:21.215 }, 00:15:21.215 { 00:15:21.215 "name": "BaseBdev2", 00:15:21.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.215 "is_configured": false, 00:15:21.215 "data_offset": 0, 00:15:21.215 "data_size": 0 00:15:21.215 }, 00:15:21.215 { 00:15:21.215 "name": "BaseBdev3", 00:15:21.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.215 "is_configured": false, 00:15:21.215 "data_offset": 0, 00:15:21.215 "data_size": 0 00:15:21.215 }, 00:15:21.215 { 00:15:21.215 "name": "BaseBdev4", 00:15:21.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.215 "is_configured": false, 00:15:21.216 "data_offset": 0, 00:15:21.216 "data_size": 0 00:15:21.216 } 00:15:21.216 ] 00:15:21.216 }' 00:15:21.216 23:37:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.216 23:37:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.474 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:21.732 [2024-07-24 23:37:06.564022] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:21.732 [2024-07-24 23:37:06.564049] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c73a0 name Existed_Raid, state configuring 00:15:21.732 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:21.991 [2024-07-24 23:37:06.732506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:21.991 [2024-07-24 23:37:06.733520] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:21.991 [2024-07-24 23:37:06.733544] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:21.991 [2024-07-24 23:37:06.733550] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:21.991 [2024-07-24 23:37:06.733555] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:21.991 [2024-07-24 23:37:06.733559] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:21.991 [2024-07-24 23:37:06.733564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.991 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.991 "name": "Existed_Raid", 00:15:21.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.991 "strip_size_kb": 64, 00:15:21.991 "state": "configuring", 00:15:21.991 "raid_level": "concat", 00:15:21.991 "superblock": false, 00:15:21.991 "num_base_bdevs": 4, 00:15:21.991 "num_base_bdevs_discovered": 1, 00:15:21.991 "num_base_bdevs_operational": 4, 00:15:21.991 "base_bdevs_list": [ 00:15:21.991 { 00:15:21.991 "name": "BaseBdev1", 00:15:21.991 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:21.991 "is_configured": true, 00:15:21.991 "data_offset": 0, 00:15:21.992 "data_size": 65536 00:15:21.992 }, 00:15:21.992 { 00:15:21.992 "name": "BaseBdev2", 00:15:21.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.992 "is_configured": false, 00:15:21.992 "data_offset": 0, 00:15:21.992 "data_size": 0 00:15:21.992 }, 00:15:21.992 { 00:15:21.992 "name": "BaseBdev3", 
00:15:21.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.992 "is_configured": false, 00:15:21.992 "data_offset": 0, 00:15:21.992 "data_size": 0 00:15:21.992 }, 00:15:21.992 { 00:15:21.992 "name": "BaseBdev4", 00:15:21.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.992 "is_configured": false, 00:15:21.992 "data_offset": 0, 00:15:21.992 "data_size": 0 00:15:21.992 } 00:15:21.992 ] 00:15:21.992 }' 00:15:21.992 23:37:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.992 23:37:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:22.558 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:22.558 [2024-07-24 23:37:07.553211] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:22.558 BaseBdev2 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.817 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.076 [ 00:15:23.076 { 00:15:23.076 "name": "BaseBdev2", 00:15:23.076 "aliases": [ 00:15:23.076 "5e5d512b-862d-4b90-a60d-e652d237dfa9" 00:15:23.076 ], 00:15:23.076 "product_name": "Malloc disk", 00:15:23.076 "block_size": 512, 00:15:23.076 "num_blocks": 65536, 00:15:23.076 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:23.076 "assigned_rate_limits": { 00:15:23.076 "rw_ios_per_sec": 0, 00:15:23.076 "rw_mbytes_per_sec": 0, 00:15:23.076 "r_mbytes_per_sec": 0, 00:15:23.076 "w_mbytes_per_sec": 0 00:15:23.076 }, 00:15:23.076 "claimed": true, 00:15:23.076 "claim_type": "exclusive_write", 00:15:23.076 "zoned": false, 00:15:23.076 "supported_io_types": { 00:15:23.076 "read": true, 00:15:23.076 "write": true, 00:15:23.076 "unmap": true, 00:15:23.076 "flush": true, 00:15:23.076 "reset": true, 00:15:23.076 "nvme_admin": false, 00:15:23.077 "nvme_io": false, 00:15:23.077 "nvme_io_md": false, 00:15:23.077 "write_zeroes": true, 00:15:23.077 "zcopy": true, 00:15:23.077 "get_zone_info": false, 00:15:23.077 "zone_management": false, 00:15:23.077 "zone_append": false, 00:15:23.077 "compare": false, 00:15:23.077 "compare_and_write": false, 00:15:23.077 "abort": true, 00:15:23.077 "seek_hole": false, 00:15:23.077 "seek_data": false, 00:15:23.077 "copy": true, 00:15:23.077 "nvme_iov_md": false 00:15:23.077 }, 00:15:23.077 "memory_domains": [ 00:15:23.077 { 00:15:23.077 "dma_device_id": "system", 00:15:23.077 "dma_device_type": 1 00:15:23.077 }, 00:15:23.077 { 00:15:23.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.077 "dma_device_type": 2 00:15:23.077 } 00:15:23.077 ], 00:15:23.077 "driver_specific": {} 00:15:23.077 } 00:15:23.077 ] 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.077 23:37:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.335 23:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.335 "name": "Existed_Raid", 00:15:23.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.335 "strip_size_kb": 64, 00:15:23.335 "state": "configuring", 00:15:23.335 "raid_level": "concat", 00:15:23.335 "superblock": false, 00:15:23.335 "num_base_bdevs": 4, 00:15:23.335 
"num_base_bdevs_discovered": 2, 00:15:23.335 "num_base_bdevs_operational": 4, 00:15:23.335 "base_bdevs_list": [ 00:15:23.335 { 00:15:23.335 "name": "BaseBdev1", 00:15:23.336 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:23.336 "is_configured": true, 00:15:23.336 "data_offset": 0, 00:15:23.336 "data_size": 65536 00:15:23.336 }, 00:15:23.336 { 00:15:23.336 "name": "BaseBdev2", 00:15:23.336 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:23.336 "is_configured": true, 00:15:23.336 "data_offset": 0, 00:15:23.336 "data_size": 65536 00:15:23.336 }, 00:15:23.336 { 00:15:23.336 "name": "BaseBdev3", 00:15:23.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.336 "is_configured": false, 00:15:23.336 "data_offset": 0, 00:15:23.336 "data_size": 0 00:15:23.336 }, 00:15:23.336 { 00:15:23.336 "name": "BaseBdev4", 00:15:23.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.336 "is_configured": false, 00:15:23.336 "data_offset": 0, 00:15:23.336 "data_size": 0 00:15:23.336 } 00:15:23.336 ] 00:15:23.336 }' 00:15:23.336 23:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.336 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:23.902 [2024-07-24 23:37:08.750940] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:23.902 BaseBdev3 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:23.902 23:37:08 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:23.902 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.160 23:37:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:24.160 [ 00:15:24.160 { 00:15:24.160 "name": "BaseBdev3", 00:15:24.160 "aliases": [ 00:15:24.160 "00c5a176-a71c-4850-95ea-253a1a5ddcaf" 00:15:24.160 ], 00:15:24.160 "product_name": "Malloc disk", 00:15:24.160 "block_size": 512, 00:15:24.160 "num_blocks": 65536, 00:15:24.160 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:24.160 "assigned_rate_limits": { 00:15:24.160 "rw_ios_per_sec": 0, 00:15:24.160 "rw_mbytes_per_sec": 0, 00:15:24.160 "r_mbytes_per_sec": 0, 00:15:24.160 "w_mbytes_per_sec": 0 00:15:24.160 }, 00:15:24.160 "claimed": true, 00:15:24.160 "claim_type": "exclusive_write", 00:15:24.160 "zoned": false, 00:15:24.160 "supported_io_types": { 00:15:24.160 "read": true, 00:15:24.160 "write": true, 00:15:24.160 "unmap": true, 00:15:24.160 "flush": true, 00:15:24.160 "reset": true, 00:15:24.160 "nvme_admin": false, 00:15:24.160 "nvme_io": false, 00:15:24.160 "nvme_io_md": false, 00:15:24.161 "write_zeroes": true, 00:15:24.161 "zcopy": true, 00:15:24.161 "get_zone_info": false, 00:15:24.161 "zone_management": false, 00:15:24.161 "zone_append": false, 00:15:24.161 "compare": false, 00:15:24.161 "compare_and_write": false, 00:15:24.161 "abort": true, 00:15:24.161 "seek_hole": false, 00:15:24.161 "seek_data": false, 00:15:24.161 "copy": 
true, 00:15:24.161 "nvme_iov_md": false 00:15:24.161 }, 00:15:24.161 "memory_domains": [ 00:15:24.161 { 00:15:24.161 "dma_device_id": "system", 00:15:24.161 "dma_device_type": 1 00:15:24.161 }, 00:15:24.161 { 00:15:24.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.161 "dma_device_type": 2 00:15:24.161 } 00:15:24.161 ], 00:15:24.161 "driver_specific": {} 00:15:24.161 } 00:15:24.161 ] 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.161 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.419 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.419 "name": "Existed_Raid", 00:15:24.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.419 "strip_size_kb": 64, 00:15:24.419 "state": "configuring", 00:15:24.419 "raid_level": "concat", 00:15:24.419 "superblock": false, 00:15:24.419 "num_base_bdevs": 4, 00:15:24.419 "num_base_bdevs_discovered": 3, 00:15:24.419 "num_base_bdevs_operational": 4, 00:15:24.419 "base_bdevs_list": [ 00:15:24.419 { 00:15:24.419 "name": "BaseBdev1", 00:15:24.419 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:24.419 "is_configured": true, 00:15:24.419 "data_offset": 0, 00:15:24.419 "data_size": 65536 00:15:24.419 }, 00:15:24.419 { 00:15:24.419 "name": "BaseBdev2", 00:15:24.419 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:24.419 "is_configured": true, 00:15:24.419 "data_offset": 0, 00:15:24.419 "data_size": 65536 00:15:24.419 }, 00:15:24.419 { 00:15:24.419 "name": "BaseBdev3", 00:15:24.419 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:24.419 "is_configured": true, 00:15:24.419 "data_offset": 0, 00:15:24.419 "data_size": 65536 00:15:24.419 }, 00:15:24.419 { 00:15:24.419 "name": "BaseBdev4", 00:15:24.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.419 "is_configured": false, 00:15:24.419 "data_offset": 0, 00:15:24.419 "data_size": 0 00:15:24.419 } 00:15:24.419 ] 00:15:24.419 }' 00:15:24.419 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.419 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:24.985 [2024-07-24 23:37:09.932681] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:24.985 [2024-07-24 23:37:09.932712] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23c83d0 00:15:24.985 [2024-07-24 23:37:09.932716] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:24.985 [2024-07-24 23:37:09.932866] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d0080 00:15:24.985 [2024-07-24 23:37:09.932949] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23c83d0 00:15:24.985 [2024-07-24 23:37:09.932954] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23c83d0 00:15:24.985 [2024-07-24 23:37:09.933081] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.985 BaseBdev4 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:24.985 23:37:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:25.242 23:37:10 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:25.500 [ 00:15:25.500 { 00:15:25.500 "name": "BaseBdev4", 00:15:25.500 "aliases": [ 00:15:25.500 "3919c6de-3aca-43d0-992d-b67f2f6e85d6" 00:15:25.500 ], 00:15:25.500 "product_name": "Malloc disk", 00:15:25.500 "block_size": 512, 00:15:25.500 "num_blocks": 65536, 00:15:25.500 "uuid": "3919c6de-3aca-43d0-992d-b67f2f6e85d6", 00:15:25.500 "assigned_rate_limits": { 00:15:25.500 "rw_ios_per_sec": 0, 00:15:25.500 "rw_mbytes_per_sec": 0, 00:15:25.500 "r_mbytes_per_sec": 0, 00:15:25.500 "w_mbytes_per_sec": 0 00:15:25.500 }, 00:15:25.500 "claimed": true, 00:15:25.500 "claim_type": "exclusive_write", 00:15:25.500 "zoned": false, 00:15:25.500 "supported_io_types": { 00:15:25.500 "read": true, 00:15:25.500 "write": true, 00:15:25.500 "unmap": true, 00:15:25.500 "flush": true, 00:15:25.500 "reset": true, 00:15:25.500 "nvme_admin": false, 00:15:25.500 "nvme_io": false, 00:15:25.500 "nvme_io_md": false, 00:15:25.500 "write_zeroes": true, 00:15:25.500 "zcopy": true, 00:15:25.500 "get_zone_info": false, 00:15:25.500 "zone_management": false, 00:15:25.500 "zone_append": false, 00:15:25.500 "compare": false, 00:15:25.500 "compare_and_write": false, 00:15:25.500 "abort": true, 00:15:25.500 "seek_hole": false, 00:15:25.500 "seek_data": false, 00:15:25.500 "copy": true, 00:15:25.500 "nvme_iov_md": false 00:15:25.500 }, 00:15:25.500 "memory_domains": [ 00:15:25.500 { 00:15:25.500 "dma_device_id": "system", 00:15:25.500 "dma_device_type": 1 00:15:25.500 }, 00:15:25.500 { 00:15:25.500 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.500 "dma_device_type": 2 00:15:25.500 } 00:15:25.500 ], 00:15:25.500 "driver_specific": {} 00:15:25.500 } 00:15:25.500 ] 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.500 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.500 "name": "Existed_Raid", 00:15:25.500 "uuid": "353434a8-8f1b-4b87-b08e-d157ddeff6ba", 00:15:25.500 "strip_size_kb": 64, 00:15:25.500 "state": "online", 00:15:25.500 "raid_level": "concat", 00:15:25.501 "superblock": false, 00:15:25.501 
"num_base_bdevs": 4, 00:15:25.501 "num_base_bdevs_discovered": 4, 00:15:25.501 "num_base_bdevs_operational": 4, 00:15:25.501 "base_bdevs_list": [ 00:15:25.501 { 00:15:25.501 "name": "BaseBdev1", 00:15:25.501 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:25.501 "is_configured": true, 00:15:25.501 "data_offset": 0, 00:15:25.501 "data_size": 65536 00:15:25.501 }, 00:15:25.501 { 00:15:25.501 "name": "BaseBdev2", 00:15:25.501 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:25.501 "is_configured": true, 00:15:25.501 "data_offset": 0, 00:15:25.501 "data_size": 65536 00:15:25.501 }, 00:15:25.501 { 00:15:25.501 "name": "BaseBdev3", 00:15:25.501 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:25.501 "is_configured": true, 00:15:25.501 "data_offset": 0, 00:15:25.501 "data_size": 65536 00:15:25.501 }, 00:15:25.501 { 00:15:25.501 "name": "BaseBdev4", 00:15:25.501 "uuid": "3919c6de-3aca-43d0-992d-b67f2f6e85d6", 00:15:25.501 "is_configured": true, 00:15:25.501 "data_offset": 0, 00:15:25.501 "data_size": 65536 00:15:25.501 } 00:15:25.501 ] 00:15:25.501 }' 00:15:25.501 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.501 23:37:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:26.066 23:37:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:26.066 23:37:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:26.324 [2024-07-24 23:37:11.123976] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:26.324 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:26.324 "name": "Existed_Raid", 00:15:26.324 "aliases": [ 00:15:26.324 "353434a8-8f1b-4b87-b08e-d157ddeff6ba" 00:15:26.324 ], 00:15:26.324 "product_name": "Raid Volume", 00:15:26.324 "block_size": 512, 00:15:26.324 "num_blocks": 262144, 00:15:26.324 "uuid": "353434a8-8f1b-4b87-b08e-d157ddeff6ba", 00:15:26.324 "assigned_rate_limits": { 00:15:26.324 "rw_ios_per_sec": 0, 00:15:26.324 "rw_mbytes_per_sec": 0, 00:15:26.324 "r_mbytes_per_sec": 0, 00:15:26.324 "w_mbytes_per_sec": 0 00:15:26.324 }, 00:15:26.324 "claimed": false, 00:15:26.324 "zoned": false, 00:15:26.324 "supported_io_types": { 00:15:26.324 "read": true, 00:15:26.324 "write": true, 00:15:26.324 "unmap": true, 00:15:26.324 "flush": true, 00:15:26.324 "reset": true, 00:15:26.324 "nvme_admin": false, 00:15:26.324 "nvme_io": false, 00:15:26.324 "nvme_io_md": false, 00:15:26.324 "write_zeroes": true, 00:15:26.324 "zcopy": false, 00:15:26.324 "get_zone_info": false, 00:15:26.324 "zone_management": false, 00:15:26.324 "zone_append": false, 00:15:26.324 "compare": false, 00:15:26.324 "compare_and_write": false, 00:15:26.324 "abort": false, 00:15:26.324 "seek_hole": false, 00:15:26.324 "seek_data": false, 00:15:26.324 "copy": false, 00:15:26.324 "nvme_iov_md": false 00:15:26.324 }, 00:15:26.324 "memory_domains": [ 00:15:26.324 { 00:15:26.324 "dma_device_id": "system", 00:15:26.324 "dma_device_type": 1 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.324 
"dma_device_type": 2 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "system", 00:15:26.324 "dma_device_type": 1 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.324 "dma_device_type": 2 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "system", 00:15:26.324 "dma_device_type": 1 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.324 "dma_device_type": 2 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "system", 00:15:26.324 "dma_device_type": 1 00:15:26.324 }, 00:15:26.324 { 00:15:26.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.324 "dma_device_type": 2 00:15:26.324 } 00:15:26.324 ], 00:15:26.324 "driver_specific": { 00:15:26.324 "raid": { 00:15:26.324 "uuid": "353434a8-8f1b-4b87-b08e-d157ddeff6ba", 00:15:26.324 "strip_size_kb": 64, 00:15:26.324 "state": "online", 00:15:26.324 "raid_level": "concat", 00:15:26.324 "superblock": false, 00:15:26.324 "num_base_bdevs": 4, 00:15:26.324 "num_base_bdevs_discovered": 4, 00:15:26.324 "num_base_bdevs_operational": 4, 00:15:26.324 "base_bdevs_list": [ 00:15:26.324 { 00:15:26.324 "name": "BaseBdev1", 00:15:26.324 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:26.324 "is_configured": true, 00:15:26.324 "data_offset": 0, 00:15:26.324 "data_size": 65536 00:15:26.324 }, 00:15:26.325 { 00:15:26.325 "name": "BaseBdev2", 00:15:26.325 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:26.325 "is_configured": true, 00:15:26.325 "data_offset": 0, 00:15:26.325 "data_size": 65536 00:15:26.325 }, 00:15:26.325 { 00:15:26.325 "name": "BaseBdev3", 00:15:26.325 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:26.325 "is_configured": true, 00:15:26.325 "data_offset": 0, 00:15:26.325 "data_size": 65536 00:15:26.325 }, 00:15:26.325 { 00:15:26.325 "name": "BaseBdev4", 00:15:26.325 "uuid": "3919c6de-3aca-43d0-992d-b67f2f6e85d6", 00:15:26.325 "is_configured": true, 00:15:26.325 "data_offset": 0, 
00:15:26.325 "data_size": 65536 00:15:26.325 } 00:15:26.325 ] 00:15:26.325 } 00:15:26.325 } 00:15:26.325 }' 00:15:26.325 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:26.325 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:26.325 BaseBdev2 00:15:26.325 BaseBdev3 00:15:26.325 BaseBdev4' 00:15:26.325 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.325 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:26.325 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.583 "name": "BaseBdev1", 00:15:26.583 "aliases": [ 00:15:26.583 "865f1c55-b9f4-4c43-8ff8-5bdf217cab33" 00:15:26.583 ], 00:15:26.583 "product_name": "Malloc disk", 00:15:26.583 "block_size": 512, 00:15:26.583 "num_blocks": 65536, 00:15:26.583 "uuid": "865f1c55-b9f4-4c43-8ff8-5bdf217cab33", 00:15:26.583 "assigned_rate_limits": { 00:15:26.583 "rw_ios_per_sec": 0, 00:15:26.583 "rw_mbytes_per_sec": 0, 00:15:26.583 "r_mbytes_per_sec": 0, 00:15:26.583 "w_mbytes_per_sec": 0 00:15:26.583 }, 00:15:26.583 "claimed": true, 00:15:26.583 "claim_type": "exclusive_write", 00:15:26.583 "zoned": false, 00:15:26.583 "supported_io_types": { 00:15:26.583 "read": true, 00:15:26.583 "write": true, 00:15:26.583 "unmap": true, 00:15:26.583 "flush": true, 00:15:26.583 "reset": true, 00:15:26.583 "nvme_admin": false, 00:15:26.583 "nvme_io": false, 00:15:26.583 "nvme_io_md": false, 00:15:26.583 "write_zeroes": true, 00:15:26.583 "zcopy": true, 00:15:26.583 "get_zone_info": false, 00:15:26.583 "zone_management": 
false, 00:15:26.583 "zone_append": false, 00:15:26.583 "compare": false, 00:15:26.583 "compare_and_write": false, 00:15:26.583 "abort": true, 00:15:26.583 "seek_hole": false, 00:15:26.583 "seek_data": false, 00:15:26.583 "copy": true, 00:15:26.583 "nvme_iov_md": false 00:15:26.583 }, 00:15:26.583 "memory_domains": [ 00:15:26.583 { 00:15:26.583 "dma_device_id": "system", 00:15:26.583 "dma_device_type": 1 00:15:26.583 }, 00:15:26.583 { 00:15:26.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.583 "dma_device_type": 2 00:15:26.583 } 00:15:26.583 ], 00:15:26.583 "driver_specific": {} 00:15:26.583 }' 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.583 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.841 23:37:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.841 "name": "BaseBdev2", 00:15:26.841 "aliases": [ 00:15:26.841 "5e5d512b-862d-4b90-a60d-e652d237dfa9" 00:15:26.841 ], 00:15:26.841 "product_name": "Malloc disk", 00:15:26.841 "block_size": 512, 00:15:26.841 "num_blocks": 65536, 00:15:26.841 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:26.841 "assigned_rate_limits": { 00:15:26.841 "rw_ios_per_sec": 0, 00:15:26.841 "rw_mbytes_per_sec": 0, 00:15:26.841 "r_mbytes_per_sec": 0, 00:15:26.841 "w_mbytes_per_sec": 0 00:15:26.841 }, 00:15:26.841 "claimed": true, 00:15:26.841 "claim_type": "exclusive_write", 00:15:26.841 "zoned": false, 00:15:26.841 "supported_io_types": { 00:15:26.841 "read": true, 00:15:26.841 "write": true, 00:15:26.841 "unmap": true, 00:15:26.841 "flush": true, 00:15:26.841 "reset": true, 00:15:26.841 "nvme_admin": false, 00:15:26.841 "nvme_io": false, 00:15:26.841 "nvme_io_md": false, 00:15:26.841 "write_zeroes": true, 00:15:26.841 "zcopy": true, 00:15:26.841 "get_zone_info": false, 00:15:26.841 "zone_management": false, 00:15:26.841 "zone_append": false, 00:15:26.841 "compare": false, 00:15:26.841 "compare_and_write": false, 00:15:26.841 "abort": true, 00:15:26.841 "seek_hole": false, 00:15:26.841 "seek_data": false, 00:15:26.841 "copy": true, 00:15:26.841 "nvme_iov_md": false 00:15:26.841 }, 00:15:26.841 "memory_domains": [ 00:15:26.841 { 00:15:26.841 "dma_device_id": "system", 00:15:26.841 "dma_device_type": 1 00:15:26.841 }, 00:15:26.841 { 00:15:26.841 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.841 "dma_device_type": 2 00:15:26.841 } 00:15:26.841 ], 00:15:26.841 "driver_specific": {} 00:15:26.841 
}' 00:15:26.841 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.099 23:37:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.099 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.099 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.099 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:27.356 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.356 "name": "BaseBdev3", 00:15:27.356 "aliases": [ 00:15:27.356 "00c5a176-a71c-4850-95ea-253a1a5ddcaf" 00:15:27.356 ], 00:15:27.356 "product_name": "Malloc disk", 00:15:27.356 "block_size": 512, 00:15:27.356 "num_blocks": 65536, 
00:15:27.356 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:27.356 "assigned_rate_limits": { 00:15:27.356 "rw_ios_per_sec": 0, 00:15:27.356 "rw_mbytes_per_sec": 0, 00:15:27.356 "r_mbytes_per_sec": 0, 00:15:27.356 "w_mbytes_per_sec": 0 00:15:27.356 }, 00:15:27.356 "claimed": true, 00:15:27.356 "claim_type": "exclusive_write", 00:15:27.356 "zoned": false, 00:15:27.356 "supported_io_types": { 00:15:27.356 "read": true, 00:15:27.356 "write": true, 00:15:27.356 "unmap": true, 00:15:27.356 "flush": true, 00:15:27.356 "reset": true, 00:15:27.356 "nvme_admin": false, 00:15:27.356 "nvme_io": false, 00:15:27.356 "nvme_io_md": false, 00:15:27.356 "write_zeroes": true, 00:15:27.356 "zcopy": true, 00:15:27.356 "get_zone_info": false, 00:15:27.356 "zone_management": false, 00:15:27.356 "zone_append": false, 00:15:27.356 "compare": false, 00:15:27.356 "compare_and_write": false, 00:15:27.356 "abort": true, 00:15:27.356 "seek_hole": false, 00:15:27.356 "seek_data": false, 00:15:27.356 "copy": true, 00:15:27.356 "nvme_iov_md": false 00:15:27.357 }, 00:15:27.357 "memory_domains": [ 00:15:27.357 { 00:15:27.357 "dma_device_id": "system", 00:15:27.357 "dma_device_type": 1 00:15:27.357 }, 00:15:27.357 { 00:15:27.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.357 "dma_device_type": 2 00:15:27.357 } 00:15:27.357 ], 00:15:27.357 "driver_specific": {} 00:15:27.357 }' 00:15:27.357 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.357 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:27.614 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.872 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.872 "name": "BaseBdev4", 00:15:27.872 "aliases": [ 00:15:27.872 "3919c6de-3aca-43d0-992d-b67f2f6e85d6" 00:15:27.872 ], 00:15:27.872 "product_name": "Malloc disk", 00:15:27.872 "block_size": 512, 00:15:27.872 "num_blocks": 65536, 00:15:27.872 "uuid": "3919c6de-3aca-43d0-992d-b67f2f6e85d6", 00:15:27.872 "assigned_rate_limits": { 00:15:27.872 "rw_ios_per_sec": 0, 00:15:27.872 "rw_mbytes_per_sec": 0, 00:15:27.872 "r_mbytes_per_sec": 0, 00:15:27.872 "w_mbytes_per_sec": 0 00:15:27.872 }, 00:15:27.872 "claimed": true, 00:15:27.872 "claim_type": "exclusive_write", 00:15:27.872 "zoned": false, 00:15:27.872 "supported_io_types": { 00:15:27.872 "read": true, 00:15:27.872 "write": true, 00:15:27.872 "unmap": true, 00:15:27.872 "flush": true, 00:15:27.872 "reset": true, 00:15:27.872 "nvme_admin": false, 00:15:27.872 "nvme_io": false, 00:15:27.872 
"nvme_io_md": false, 00:15:27.872 "write_zeroes": true, 00:15:27.872 "zcopy": true, 00:15:27.872 "get_zone_info": false, 00:15:27.872 "zone_management": false, 00:15:27.872 "zone_append": false, 00:15:27.872 "compare": false, 00:15:27.872 "compare_and_write": false, 00:15:27.872 "abort": true, 00:15:27.872 "seek_hole": false, 00:15:27.872 "seek_data": false, 00:15:27.872 "copy": true, 00:15:27.872 "nvme_iov_md": false 00:15:27.872 }, 00:15:27.872 "memory_domains": [ 00:15:27.872 { 00:15:27.872 "dma_device_id": "system", 00:15:27.872 "dma_device_type": 1 00:15:27.872 }, 00:15:27.872 { 00:15:27.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.872 "dma_device_type": 2 00:15:27.872 } 00:15:27.872 ], 00:15:27.872 "driver_specific": {} 00:15:27.872 }' 00:15:27.872 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.872 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.872 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.872 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.130 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:28.130 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:28.130 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.130 23:37:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:28.130 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:28.130 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.130 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:28.130 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:15:28.130 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:28.388 [2024-07-24 23:37:13.241264] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:28.388 [2024-07-24 23:37:13.241284] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:28.388 [2024-07-24 23:37:13.241316] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.388 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:15:28.389 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.389 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.389 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.389 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.645 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.645 "name": "Existed_Raid", 00:15:28.645 "uuid": "353434a8-8f1b-4b87-b08e-d157ddeff6ba", 00:15:28.645 "strip_size_kb": 64, 00:15:28.645 "state": "offline", 00:15:28.645 "raid_level": "concat", 00:15:28.645 "superblock": false, 00:15:28.645 "num_base_bdevs": 4, 00:15:28.645 "num_base_bdevs_discovered": 3, 00:15:28.645 "num_base_bdevs_operational": 3, 00:15:28.645 "base_bdevs_list": [ 00:15:28.645 { 00:15:28.645 "name": null, 00:15:28.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.645 "is_configured": false, 00:15:28.645 "data_offset": 0, 00:15:28.645 "data_size": 65536 00:15:28.645 }, 00:15:28.645 { 00:15:28.645 "name": "BaseBdev2", 00:15:28.645 "uuid": "5e5d512b-862d-4b90-a60d-e652d237dfa9", 00:15:28.645 "is_configured": true, 00:15:28.645 "data_offset": 0, 00:15:28.645 "data_size": 65536 00:15:28.645 }, 00:15:28.645 { 00:15:28.645 "name": "BaseBdev3", 00:15:28.645 "uuid": "00c5a176-a71c-4850-95ea-253a1a5ddcaf", 00:15:28.645 "is_configured": true, 00:15:28.645 "data_offset": 0, 00:15:28.645 "data_size": 65536 00:15:28.645 }, 00:15:28.645 { 00:15:28.645 "name": "BaseBdev4", 00:15:28.645 "uuid": "3919c6de-3aca-43d0-992d-b67f2f6e85d6", 00:15:28.645 "is_configured": true, 00:15:28.645 "data_offset": 0, 00:15:28.645 "data_size": 65536 00:15:28.645 } 00:15:28.645 ] 00:15:28.645 }' 
00:15:28.645 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.646 23:37:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.903 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:29.160 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:29.160 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.160 23:37:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:29.160 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:29.160 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:29.160 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:29.417 [2024-07-24 23:37:14.240839] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:29.417 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:29.417 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:29.417 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.417 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:29.673 [2024-07-24 23:37:14.587633] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.673 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:29.930 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:29.930 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:29.930 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:29.930 [2024-07-24 23:37:14.930333] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:29.930 [2024-07-24 23:37:14.930364] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23c83d0 name Existed_Raid, state offline 00:15:30.188 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:30.188 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:30.188 23:37:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.188 23:37:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.188 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:30.445 BaseBdev2 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:30.445 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.703 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:30.703 [ 00:15:30.703 { 00:15:30.703 "name": 
"BaseBdev2", 00:15:30.703 "aliases": [ 00:15:30.703 "b1db9a81-70c7-45ad-815d-2044b415387b" 00:15:30.703 ], 00:15:30.703 "product_name": "Malloc disk", 00:15:30.703 "block_size": 512, 00:15:30.703 "num_blocks": 65536, 00:15:30.703 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:30.703 "assigned_rate_limits": { 00:15:30.703 "rw_ios_per_sec": 0, 00:15:30.703 "rw_mbytes_per_sec": 0, 00:15:30.703 "r_mbytes_per_sec": 0, 00:15:30.703 "w_mbytes_per_sec": 0 00:15:30.703 }, 00:15:30.703 "claimed": false, 00:15:30.703 "zoned": false, 00:15:30.703 "supported_io_types": { 00:15:30.703 "read": true, 00:15:30.703 "write": true, 00:15:30.703 "unmap": true, 00:15:30.703 "flush": true, 00:15:30.703 "reset": true, 00:15:30.703 "nvme_admin": false, 00:15:30.703 "nvme_io": false, 00:15:30.703 "nvme_io_md": false, 00:15:30.703 "write_zeroes": true, 00:15:30.703 "zcopy": true, 00:15:30.703 "get_zone_info": false, 00:15:30.703 "zone_management": false, 00:15:30.703 "zone_append": false, 00:15:30.703 "compare": false, 00:15:30.703 "compare_and_write": false, 00:15:30.703 "abort": true, 00:15:30.703 "seek_hole": false, 00:15:30.703 "seek_data": false, 00:15:30.703 "copy": true, 00:15:30.703 "nvme_iov_md": false 00:15:30.703 }, 00:15:30.703 "memory_domains": [ 00:15:30.703 { 00:15:30.703 "dma_device_id": "system", 00:15:30.703 "dma_device_type": 1 00:15:30.703 }, 00:15:30.703 { 00:15:30.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.703 "dma_device_type": 2 00:15:30.703 } 00:15:30.703 ], 00:15:30.703 "driver_specific": {} 00:15:30.703 } 00:15:30.703 ] 00:15:30.703 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:30.703 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:30.703 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.703 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:30.961 BaseBdev3 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.961 23:37:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:31.218 [ 00:15:31.218 { 00:15:31.218 "name": "BaseBdev3", 00:15:31.218 "aliases": [ 00:15:31.218 "7528dd90-e9d6-4fbc-bd90-b60ae7106c08" 00:15:31.218 ], 00:15:31.218 "product_name": "Malloc disk", 00:15:31.218 "block_size": 512, 00:15:31.218 "num_blocks": 65536, 00:15:31.218 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:31.218 "assigned_rate_limits": { 00:15:31.218 "rw_ios_per_sec": 0, 00:15:31.218 "rw_mbytes_per_sec": 0, 00:15:31.218 "r_mbytes_per_sec": 0, 00:15:31.219 "w_mbytes_per_sec": 0 00:15:31.219 }, 00:15:31.219 "claimed": false, 00:15:31.219 "zoned": false, 00:15:31.219 "supported_io_types": { 00:15:31.219 "read": true, 00:15:31.219 "write": true, 00:15:31.219 "unmap": true, 00:15:31.219 "flush": true, 00:15:31.219 
"reset": true, 00:15:31.219 "nvme_admin": false, 00:15:31.219 "nvme_io": false, 00:15:31.219 "nvme_io_md": false, 00:15:31.219 "write_zeroes": true, 00:15:31.219 "zcopy": true, 00:15:31.219 "get_zone_info": false, 00:15:31.219 "zone_management": false, 00:15:31.219 "zone_append": false, 00:15:31.219 "compare": false, 00:15:31.219 "compare_and_write": false, 00:15:31.219 "abort": true, 00:15:31.219 "seek_hole": false, 00:15:31.219 "seek_data": false, 00:15:31.219 "copy": true, 00:15:31.219 "nvme_iov_md": false 00:15:31.219 }, 00:15:31.219 "memory_domains": [ 00:15:31.219 { 00:15:31.219 "dma_device_id": "system", 00:15:31.219 "dma_device_type": 1 00:15:31.219 }, 00:15:31.219 { 00:15:31.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.219 "dma_device_type": 2 00:15:31.219 } 00:15:31.219 ], 00:15:31.219 "driver_specific": {} 00:15:31.219 } 00:15:31.219 ] 00:15:31.219 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:31.219 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:31.219 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:31.219 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:31.476 BaseBdev4 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:31.476 23:37:16 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.476 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:31.734 [ 00:15:31.734 { 00:15:31.734 "name": "BaseBdev4", 00:15:31.734 "aliases": [ 00:15:31.734 "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1" 00:15:31.734 ], 00:15:31.734 "product_name": "Malloc disk", 00:15:31.734 "block_size": 512, 00:15:31.734 "num_blocks": 65536, 00:15:31.734 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:31.734 "assigned_rate_limits": { 00:15:31.734 "rw_ios_per_sec": 0, 00:15:31.734 "rw_mbytes_per_sec": 0, 00:15:31.734 "r_mbytes_per_sec": 0, 00:15:31.734 "w_mbytes_per_sec": 0 00:15:31.734 }, 00:15:31.734 "claimed": false, 00:15:31.734 "zoned": false, 00:15:31.734 "supported_io_types": { 00:15:31.734 "read": true, 00:15:31.734 "write": true, 00:15:31.734 "unmap": true, 00:15:31.734 "flush": true, 00:15:31.734 "reset": true, 00:15:31.734 "nvme_admin": false, 00:15:31.734 "nvme_io": false, 00:15:31.734 "nvme_io_md": false, 00:15:31.734 "write_zeroes": true, 00:15:31.734 "zcopy": true, 00:15:31.734 "get_zone_info": false, 00:15:31.734 "zone_management": false, 00:15:31.734 "zone_append": false, 00:15:31.734 "compare": false, 00:15:31.734 "compare_and_write": false, 00:15:31.734 "abort": true, 00:15:31.734 "seek_hole": false, 00:15:31.734 "seek_data": false, 00:15:31.734 "copy": true, 00:15:31.734 "nvme_iov_md": false 00:15:31.734 }, 00:15:31.734 "memory_domains": [ 00:15:31.734 { 00:15:31.734 "dma_device_id": "system", 00:15:31.734 "dma_device_type": 1 00:15:31.734 }, 00:15:31.734 { 00:15:31.734 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:31.734 "dma_device_type": 2 00:15:31.734 } 00:15:31.734 ], 00:15:31.734 "driver_specific": {} 00:15:31.734 } 00:15:31.734 ] 00:15:31.734 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:31.734 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:31.734 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:31.734 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:31.992 [2024-07-24 23:37:16.764053] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:31.992 [2024-07-24 23:37:16.764081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:31.992 [2024-07-24 23:37:16.764095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:31.992 [2024-07-24 23:37:16.765070] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:31.992 [2024-07-24 23:37:16.765099] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.992 
23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.992 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.992 "name": "Existed_Raid", 00:15:31.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.992 "strip_size_kb": 64, 00:15:31.992 "state": "configuring", 00:15:31.992 "raid_level": "concat", 00:15:31.992 "superblock": false, 00:15:31.992 "num_base_bdevs": 4, 00:15:31.992 "num_base_bdevs_discovered": 3, 00:15:31.992 "num_base_bdevs_operational": 4, 00:15:31.992 "base_bdevs_list": [ 00:15:31.992 { 00:15:31.992 "name": "BaseBdev1", 00:15:31.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.992 "is_configured": false, 00:15:31.992 "data_offset": 0, 00:15:31.992 "data_size": 0 00:15:31.992 }, 00:15:31.992 { 00:15:31.992 "name": "BaseBdev2", 00:15:31.992 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:31.992 "is_configured": true, 00:15:31.992 "data_offset": 0, 00:15:31.992 "data_size": 65536 00:15:31.992 }, 00:15:31.992 { 00:15:31.992 "name": "BaseBdev3", 00:15:31.992 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:31.992 "is_configured": true, 00:15:31.993 "data_offset": 
0, 00:15:31.993 "data_size": 65536 00:15:31.993 }, 00:15:31.993 { 00:15:31.993 "name": "BaseBdev4", 00:15:31.993 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:31.993 "is_configured": true, 00:15:31.993 "data_offset": 0, 00:15:31.993 "data_size": 65536 00:15:31.993 } 00:15:31.993 ] 00:15:31.993 }' 00:15:31.993 23:37:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.993 23:37:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.557 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:32.815 [2024-07-24 23:37:17.582157] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.815 "name": "Existed_Raid", 00:15:32.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.815 "strip_size_kb": 64, 00:15:32.815 "state": "configuring", 00:15:32.815 "raid_level": "concat", 00:15:32.815 "superblock": false, 00:15:32.815 "num_base_bdevs": 4, 00:15:32.815 "num_base_bdevs_discovered": 2, 00:15:32.815 "num_base_bdevs_operational": 4, 00:15:32.815 "base_bdevs_list": [ 00:15:32.815 { 00:15:32.815 "name": "BaseBdev1", 00:15:32.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.815 "is_configured": false, 00:15:32.815 "data_offset": 0, 00:15:32.815 "data_size": 0 00:15:32.815 }, 00:15:32.815 { 00:15:32.815 "name": null, 00:15:32.815 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:32.815 "is_configured": false, 00:15:32.815 "data_offset": 0, 00:15:32.815 "data_size": 65536 00:15:32.815 }, 00:15:32.815 { 00:15:32.815 "name": "BaseBdev3", 00:15:32.815 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:32.815 "is_configured": true, 00:15:32.815 "data_offset": 0, 00:15:32.815 "data_size": 65536 00:15:32.815 }, 00:15:32.815 { 00:15:32.815 "name": "BaseBdev4", 00:15:32.815 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:32.815 "is_configured": true, 00:15:32.815 "data_offset": 0, 00:15:32.815 "data_size": 65536 00:15:32.815 } 00:15:32.815 ] 00:15:32.815 }' 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.815 23:37:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.380 23:37:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.380 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:33.637 [2024-07-24 23:37:18.595501] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:33.637 BaseBdev1 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:33.637 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:33.894 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:34.151 [ 00:15:34.151 { 00:15:34.151 "name": "BaseBdev1", 00:15:34.151 "aliases": [ 00:15:34.151 
"ccf828e3-200d-49e7-aadc-cd995266972f" 00:15:34.151 ], 00:15:34.151 "product_name": "Malloc disk", 00:15:34.151 "block_size": 512, 00:15:34.151 "num_blocks": 65536, 00:15:34.151 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:34.151 "assigned_rate_limits": { 00:15:34.151 "rw_ios_per_sec": 0, 00:15:34.151 "rw_mbytes_per_sec": 0, 00:15:34.151 "r_mbytes_per_sec": 0, 00:15:34.151 "w_mbytes_per_sec": 0 00:15:34.151 }, 00:15:34.151 "claimed": true, 00:15:34.151 "claim_type": "exclusive_write", 00:15:34.151 "zoned": false, 00:15:34.151 "supported_io_types": { 00:15:34.151 "read": true, 00:15:34.151 "write": true, 00:15:34.151 "unmap": true, 00:15:34.151 "flush": true, 00:15:34.151 "reset": true, 00:15:34.151 "nvme_admin": false, 00:15:34.151 "nvme_io": false, 00:15:34.151 "nvme_io_md": false, 00:15:34.151 "write_zeroes": true, 00:15:34.151 "zcopy": true, 00:15:34.151 "get_zone_info": false, 00:15:34.151 "zone_management": false, 00:15:34.151 "zone_append": false, 00:15:34.151 "compare": false, 00:15:34.151 "compare_and_write": false, 00:15:34.151 "abort": true, 00:15:34.151 "seek_hole": false, 00:15:34.151 "seek_data": false, 00:15:34.151 "copy": true, 00:15:34.151 "nvme_iov_md": false 00:15:34.151 }, 00:15:34.151 "memory_domains": [ 00:15:34.151 { 00:15:34.151 "dma_device_id": "system", 00:15:34.151 "dma_device_type": 1 00:15:34.151 }, 00:15:34.151 { 00:15:34.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.151 "dma_device_type": 2 00:15:34.151 } 00:15:34.151 ], 00:15:34.151 "driver_specific": {} 00:15:34.151 } 00:15:34.151 ] 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.151 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.152 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.152 23:37:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.152 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.152 "name": "Existed_Raid", 00:15:34.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.152 "strip_size_kb": 64, 00:15:34.152 "state": "configuring", 00:15:34.152 "raid_level": "concat", 00:15:34.152 "superblock": false, 00:15:34.152 "num_base_bdevs": 4, 00:15:34.152 "num_base_bdevs_discovered": 3, 00:15:34.152 "num_base_bdevs_operational": 4, 00:15:34.152 "base_bdevs_list": [ 00:15:34.152 { 00:15:34.152 "name": "BaseBdev1", 00:15:34.152 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:34.152 "is_configured": true, 00:15:34.152 "data_offset": 0, 00:15:34.152 "data_size": 65536 00:15:34.152 }, 00:15:34.152 { 00:15:34.152 "name": null, 00:15:34.152 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 
00:15:34.152 "is_configured": false, 00:15:34.152 "data_offset": 0, 00:15:34.152 "data_size": 65536 00:15:34.152 }, 00:15:34.152 { 00:15:34.152 "name": "BaseBdev3", 00:15:34.152 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:34.152 "is_configured": true, 00:15:34.152 "data_offset": 0, 00:15:34.152 "data_size": 65536 00:15:34.152 }, 00:15:34.152 { 00:15:34.152 "name": "BaseBdev4", 00:15:34.152 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:34.152 "is_configured": true, 00:15:34.152 "data_offset": 0, 00:15:34.152 "data_size": 65536 00:15:34.152 } 00:15:34.152 ] 00:15:34.152 }' 00:15:34.152 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.152 23:37:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.717 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.717 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:34.975 [2024-07-24 23:37:19.899017] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.975 23:37:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.975 23:37:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.233 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.233 "name": "Existed_Raid", 00:15:35.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.233 "strip_size_kb": 64, 00:15:35.233 "state": "configuring", 00:15:35.233 "raid_level": "concat", 00:15:35.233 "superblock": false, 00:15:35.233 "num_base_bdevs": 4, 00:15:35.233 "num_base_bdevs_discovered": 2, 00:15:35.233 "num_base_bdevs_operational": 4, 00:15:35.233 "base_bdevs_list": [ 00:15:35.233 { 00:15:35.233 "name": "BaseBdev1", 00:15:35.233 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:35.233 "is_configured": true, 00:15:35.233 "data_offset": 0, 00:15:35.233 "data_size": 65536 00:15:35.233 }, 00:15:35.233 { 00:15:35.233 "name": null, 00:15:35.233 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:35.233 "is_configured": false, 00:15:35.233 "data_offset": 0, 00:15:35.233 
"data_size": 65536 00:15:35.233 }, 00:15:35.233 { 00:15:35.233 "name": null, 00:15:35.233 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:35.233 "is_configured": false, 00:15:35.233 "data_offset": 0, 00:15:35.233 "data_size": 65536 00:15:35.233 }, 00:15:35.233 { 00:15:35.233 "name": "BaseBdev4", 00:15:35.233 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:35.233 "is_configured": true, 00:15:35.233 "data_offset": 0, 00:15:35.233 "data_size": 65536 00:15:35.233 } 00:15:35.233 ] 00:15:35.233 }' 00:15:35.233 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.233 23:37:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.798 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:35.798 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.798 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:35.798 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:36.055 [2024-07-24 23:37:20.917799] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.055 23:37:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.313 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.313 "name": "Existed_Raid", 00:15:36.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.313 "strip_size_kb": 64, 00:15:36.313 "state": "configuring", 00:15:36.313 "raid_level": "concat", 00:15:36.313 "superblock": false, 00:15:36.313 "num_base_bdevs": 4, 00:15:36.313 "num_base_bdevs_discovered": 3, 00:15:36.313 "num_base_bdevs_operational": 4, 00:15:36.313 "base_bdevs_list": [ 00:15:36.313 { 00:15:36.313 "name": "BaseBdev1", 00:15:36.313 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:36.313 "is_configured": true, 00:15:36.313 "data_offset": 0, 00:15:36.313 "data_size": 65536 00:15:36.313 }, 00:15:36.313 { 00:15:36.313 "name": null, 00:15:36.313 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:36.313 "is_configured": false, 00:15:36.313 "data_offset": 0, 00:15:36.313 "data_size": 65536 00:15:36.313 }, 00:15:36.313 { 00:15:36.313 "name": 
"BaseBdev3", 00:15:36.313 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:36.313 "is_configured": true, 00:15:36.313 "data_offset": 0, 00:15:36.313 "data_size": 65536 00:15:36.313 }, 00:15:36.313 { 00:15:36.313 "name": "BaseBdev4", 00:15:36.313 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:36.313 "is_configured": true, 00:15:36.313 "data_offset": 0, 00:15:36.313 "data_size": 65536 00:15:36.313 } 00:15:36.313 ] 00:15:36.313 }' 00:15:36.313 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.313 23:37:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.878 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.878 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:36.878 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:36.878 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:37.136 [2024-07-24 23:37:21.924434] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.136 23:37:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.395 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.395 "name": "Existed_Raid", 00:15:37.395 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.395 "strip_size_kb": 64, 00:15:37.395 "state": "configuring", 00:15:37.395 "raid_level": "concat", 00:15:37.395 "superblock": false, 00:15:37.395 "num_base_bdevs": 4, 00:15:37.395 "num_base_bdevs_discovered": 2, 00:15:37.395 "num_base_bdevs_operational": 4, 00:15:37.395 "base_bdevs_list": [ 00:15:37.395 { 00:15:37.395 "name": null, 00:15:37.395 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:37.395 "is_configured": false, 00:15:37.395 "data_offset": 0, 00:15:37.395 "data_size": 65536 00:15:37.395 }, 00:15:37.395 { 00:15:37.395 "name": null, 00:15:37.395 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:37.395 "is_configured": false, 00:15:37.395 "data_offset": 0, 00:15:37.395 "data_size": 65536 00:15:37.395 }, 00:15:37.395 { 00:15:37.395 "name": "BaseBdev3", 00:15:37.395 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:37.395 "is_configured": true, 
00:15:37.395 "data_offset": 0, 00:15:37.395 "data_size": 65536 00:15:37.395 }, 00:15:37.395 { 00:15:37.395 "name": "BaseBdev4", 00:15:37.395 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:37.395 "is_configured": true, 00:15:37.395 "data_offset": 0, 00:15:37.395 "data_size": 65536 00:15:37.395 } 00:15:37.395 ] 00:15:37.395 }' 00:15:37.395 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.395 23:37:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.652 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:37.652 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.910 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:37.910 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:38.167 [2024-07-24 23:37:22.972694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.167 23:37:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.167 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.167 "name": "Existed_Raid", 00:15:38.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.167 "strip_size_kb": 64, 00:15:38.167 "state": "configuring", 00:15:38.167 "raid_level": "concat", 00:15:38.167 "superblock": false, 00:15:38.167 "num_base_bdevs": 4, 00:15:38.167 "num_base_bdevs_discovered": 3, 00:15:38.167 "num_base_bdevs_operational": 4, 00:15:38.167 "base_bdevs_list": [ 00:15:38.167 { 00:15:38.167 "name": null, 00:15:38.167 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:38.167 "is_configured": false, 00:15:38.167 "data_offset": 0, 00:15:38.167 "data_size": 65536 00:15:38.167 }, 00:15:38.167 { 00:15:38.167 "name": "BaseBdev2", 00:15:38.167 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:38.167 "is_configured": true, 00:15:38.167 "data_offset": 0, 00:15:38.167 "data_size": 65536 00:15:38.167 }, 00:15:38.167 { 00:15:38.167 "name": "BaseBdev3", 00:15:38.167 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:38.167 "is_configured": true, 00:15:38.167 "data_offset": 0, 00:15:38.167 "data_size": 65536 00:15:38.167 
}, 00:15:38.167 { 00:15:38.167 "name": "BaseBdev4", 00:15:38.167 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:38.167 "is_configured": true, 00:15:38.167 "data_offset": 0, 00:15:38.167 "data_size": 65536 00:15:38.167 } 00:15:38.167 ] 00:15:38.167 }' 00:15:38.167 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.167 23:37:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.732 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.732 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:38.989 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:38.989 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.989 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:39.247 23:37:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ccf828e3-200d-49e7-aadc-cd995266972f 00:15:39.247 [2024-07-24 23:37:24.150315] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:39.247 [2024-07-24 23:37:24.150341] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23cc7e0 00:15:39.247 [2024-07-24 23:37:24.150345] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:39.247 [2024-07-24 23:37:24.150475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23cbba0 00:15:39.247 
[2024-07-24 23:37:24.150574] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23cc7e0 00:15:39.247 [2024-07-24 23:37:24.150579] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23cc7e0 00:15:39.247 [2024-07-24 23:37:24.150693] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:39.247 NewBaseBdev 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:39.247 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:39.505 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:39.505 [ 00:15:39.505 { 00:15:39.505 "name": "NewBaseBdev", 00:15:39.505 "aliases": [ 00:15:39.505 "ccf828e3-200d-49e7-aadc-cd995266972f" 00:15:39.505 ], 00:15:39.505 "product_name": "Malloc disk", 00:15:39.505 "block_size": 512, 00:15:39.505 "num_blocks": 65536, 00:15:39.505 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:39.505 "assigned_rate_limits": { 00:15:39.505 "rw_ios_per_sec": 0, 00:15:39.505 "rw_mbytes_per_sec": 0, 00:15:39.505 "r_mbytes_per_sec": 0, 00:15:39.505 
"w_mbytes_per_sec": 0 00:15:39.505 }, 00:15:39.505 "claimed": true, 00:15:39.505 "claim_type": "exclusive_write", 00:15:39.505 "zoned": false, 00:15:39.505 "supported_io_types": { 00:15:39.505 "read": true, 00:15:39.505 "write": true, 00:15:39.505 "unmap": true, 00:15:39.505 "flush": true, 00:15:39.505 "reset": true, 00:15:39.505 "nvme_admin": false, 00:15:39.505 "nvme_io": false, 00:15:39.505 "nvme_io_md": false, 00:15:39.505 "write_zeroes": true, 00:15:39.505 "zcopy": true, 00:15:39.505 "get_zone_info": false, 00:15:39.505 "zone_management": false, 00:15:39.505 "zone_append": false, 00:15:39.505 "compare": false, 00:15:39.505 "compare_and_write": false, 00:15:39.505 "abort": true, 00:15:39.505 "seek_hole": false, 00:15:39.505 "seek_data": false, 00:15:39.505 "copy": true, 00:15:39.505 "nvme_iov_md": false 00:15:39.505 }, 00:15:39.505 "memory_domains": [ 00:15:39.505 { 00:15:39.505 "dma_device_id": "system", 00:15:39.505 "dma_device_type": 1 00:15:39.505 }, 00:15:39.505 { 00:15:39.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.505 "dma_device_type": 2 00:15:39.505 } 00:15:39.505 ], 00:15:39.505 "driver_specific": {} 00:15:39.505 } 00:15:39.505 ] 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.763 "name": "Existed_Raid", 00:15:39.763 "uuid": "ec32ce63-9264-440c-a46b-badfc80f6a53", 00:15:39.763 "strip_size_kb": 64, 00:15:39.763 "state": "online", 00:15:39.763 "raid_level": "concat", 00:15:39.763 "superblock": false, 00:15:39.763 "num_base_bdevs": 4, 00:15:39.763 "num_base_bdevs_discovered": 4, 00:15:39.763 "num_base_bdevs_operational": 4, 00:15:39.763 "base_bdevs_list": [ 00:15:39.763 { 00:15:39.763 "name": "NewBaseBdev", 00:15:39.763 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:39.763 "is_configured": true, 00:15:39.763 "data_offset": 0, 00:15:39.763 "data_size": 65536 00:15:39.763 }, 00:15:39.763 { 00:15:39.763 "name": "BaseBdev2", 00:15:39.763 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:39.763 "is_configured": true, 00:15:39.763 "data_offset": 0, 00:15:39.763 "data_size": 65536 00:15:39.763 }, 00:15:39.763 { 00:15:39.763 "name": "BaseBdev3", 00:15:39.763 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:39.763 "is_configured": true, 00:15:39.763 "data_offset": 0, 00:15:39.763 "data_size": 65536 00:15:39.763 }, 00:15:39.763 { 00:15:39.763 "name": "BaseBdev4", 
00:15:39.763 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:39.763 "is_configured": true, 00:15:39.763 "data_offset": 0, 00:15:39.763 "data_size": 65536 00:15:39.763 } 00:15:39.763 ] 00:15:39.763 }' 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.763 23:37:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.328 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:40.328 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:40.329 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:40.329 [2024-07-24 23:37:25.317577] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:40.587 "name": "Existed_Raid", 00:15:40.587 "aliases": [ 00:15:40.587 "ec32ce63-9264-440c-a46b-badfc80f6a53" 00:15:40.587 ], 00:15:40.587 "product_name": "Raid Volume", 00:15:40.587 "block_size": 512, 00:15:40.587 "num_blocks": 262144, 00:15:40.587 "uuid": "ec32ce63-9264-440c-a46b-badfc80f6a53", 00:15:40.587 "assigned_rate_limits": { 00:15:40.587 "rw_ios_per_sec": 0, 00:15:40.587 
"rw_mbytes_per_sec": 0, 00:15:40.587 "r_mbytes_per_sec": 0, 00:15:40.587 "w_mbytes_per_sec": 0 00:15:40.587 }, 00:15:40.587 "claimed": false, 00:15:40.587 "zoned": false, 00:15:40.587 "supported_io_types": { 00:15:40.587 "read": true, 00:15:40.587 "write": true, 00:15:40.587 "unmap": true, 00:15:40.587 "flush": true, 00:15:40.587 "reset": true, 00:15:40.587 "nvme_admin": false, 00:15:40.587 "nvme_io": false, 00:15:40.587 "nvme_io_md": false, 00:15:40.587 "write_zeroes": true, 00:15:40.587 "zcopy": false, 00:15:40.587 "get_zone_info": false, 00:15:40.587 "zone_management": false, 00:15:40.587 "zone_append": false, 00:15:40.587 "compare": false, 00:15:40.587 "compare_and_write": false, 00:15:40.587 "abort": false, 00:15:40.587 "seek_hole": false, 00:15:40.587 "seek_data": false, 00:15:40.587 "copy": false, 00:15:40.587 "nvme_iov_md": false 00:15:40.587 }, 00:15:40.587 "memory_domains": [ 00:15:40.587 { 00:15:40.587 "dma_device_id": "system", 00:15:40.587 "dma_device_type": 1 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.587 "dma_device_type": 2 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "system", 00:15:40.587 "dma_device_type": 1 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.587 "dma_device_type": 2 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "system", 00:15:40.587 "dma_device_type": 1 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.587 "dma_device_type": 2 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "system", 00:15:40.587 "dma_device_type": 1 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.587 "dma_device_type": 2 00:15:40.587 } 00:15:40.587 ], 00:15:40.587 "driver_specific": { 00:15:40.587 "raid": { 00:15:40.587 "uuid": "ec32ce63-9264-440c-a46b-badfc80f6a53", 00:15:40.587 "strip_size_kb": 64, 00:15:40.587 "state": "online", 
00:15:40.587 "raid_level": "concat", 00:15:40.587 "superblock": false, 00:15:40.587 "num_base_bdevs": 4, 00:15:40.587 "num_base_bdevs_discovered": 4, 00:15:40.587 "num_base_bdevs_operational": 4, 00:15:40.587 "base_bdevs_list": [ 00:15:40.587 { 00:15:40.587 "name": "NewBaseBdev", 00:15:40.587 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:40.587 "is_configured": true, 00:15:40.587 "data_offset": 0, 00:15:40.587 "data_size": 65536 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "name": "BaseBdev2", 00:15:40.587 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:40.587 "is_configured": true, 00:15:40.587 "data_offset": 0, 00:15:40.587 "data_size": 65536 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "name": "BaseBdev3", 00:15:40.587 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:40.587 "is_configured": true, 00:15:40.587 "data_offset": 0, 00:15:40.587 "data_size": 65536 00:15:40.587 }, 00:15:40.587 { 00:15:40.587 "name": "BaseBdev4", 00:15:40.587 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:40.587 "is_configured": true, 00:15:40.587 "data_offset": 0, 00:15:40.587 "data_size": 65536 00:15:40.587 } 00:15:40.587 ] 00:15:40.587 } 00:15:40.587 } 00:15:40.587 }' 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:40.587 BaseBdev2 00:15:40.587 BaseBdev3 00:15:40.587 BaseBdev4' 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:40.587 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:40.587 23:37:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:40.587 "name": "NewBaseBdev", 00:15:40.587 "aliases": [ 00:15:40.587 "ccf828e3-200d-49e7-aadc-cd995266972f" 00:15:40.587 ], 00:15:40.587 "product_name": "Malloc disk", 00:15:40.587 "block_size": 512, 00:15:40.587 "num_blocks": 65536, 00:15:40.587 "uuid": "ccf828e3-200d-49e7-aadc-cd995266972f", 00:15:40.587 "assigned_rate_limits": { 00:15:40.587 "rw_ios_per_sec": 0, 00:15:40.587 "rw_mbytes_per_sec": 0, 00:15:40.587 "r_mbytes_per_sec": 0, 00:15:40.587 "w_mbytes_per_sec": 0 00:15:40.587 }, 00:15:40.587 "claimed": true, 00:15:40.587 "claim_type": "exclusive_write", 00:15:40.587 "zoned": false, 00:15:40.587 "supported_io_types": { 00:15:40.587 "read": true, 00:15:40.587 "write": true, 00:15:40.587 "unmap": true, 00:15:40.587 "flush": true, 00:15:40.587 "reset": true, 00:15:40.587 "nvme_admin": false, 00:15:40.587 "nvme_io": false, 00:15:40.587 "nvme_io_md": false, 00:15:40.587 "write_zeroes": true, 00:15:40.587 "zcopy": true, 00:15:40.587 "get_zone_info": false, 00:15:40.587 "zone_management": false, 00:15:40.587 "zone_append": false, 00:15:40.587 "compare": false, 00:15:40.587 "compare_and_write": false, 00:15:40.587 "abort": true, 00:15:40.587 "seek_hole": false, 00:15:40.587 "seek_data": false, 00:15:40.587 "copy": true, 00:15:40.587 "nvme_iov_md": false 00:15:40.587 }, 00:15:40.587 "memory_domains": [ 00:15:40.587 { 00:15:40.588 "dma_device_id": "system", 00:15:40.588 "dma_device_type": 1 00:15:40.588 }, 00:15:40.588 { 00:15:40.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.588 "dma_device_type": 2 00:15:40.588 } 00:15:40.588 ], 00:15:40.588 "driver_specific": {} 00:15:40.588 }' 00:15:40.588 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:40.845 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.103 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.103 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.103 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.103 23:37:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:41.103 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.103 "name": "BaseBdev2", 00:15:41.103 "aliases": [ 00:15:41.103 "b1db9a81-70c7-45ad-815d-2044b415387b" 00:15:41.103 ], 00:15:41.103 "product_name": "Malloc disk", 00:15:41.103 "block_size": 512, 00:15:41.103 "num_blocks": 65536, 00:15:41.103 "uuid": "b1db9a81-70c7-45ad-815d-2044b415387b", 00:15:41.103 "assigned_rate_limits": { 00:15:41.103 "rw_ios_per_sec": 0, 00:15:41.103 "rw_mbytes_per_sec": 0, 00:15:41.103 "r_mbytes_per_sec": 0, 00:15:41.103 "w_mbytes_per_sec": 0 00:15:41.103 }, 00:15:41.103 "claimed": true, 00:15:41.103 
"claim_type": "exclusive_write", 00:15:41.103 "zoned": false, 00:15:41.103 "supported_io_types": { 00:15:41.103 "read": true, 00:15:41.103 "write": true, 00:15:41.103 "unmap": true, 00:15:41.103 "flush": true, 00:15:41.103 "reset": true, 00:15:41.103 "nvme_admin": false, 00:15:41.103 "nvme_io": false, 00:15:41.103 "nvme_io_md": false, 00:15:41.103 "write_zeroes": true, 00:15:41.103 "zcopy": true, 00:15:41.103 "get_zone_info": false, 00:15:41.103 "zone_management": false, 00:15:41.103 "zone_append": false, 00:15:41.103 "compare": false, 00:15:41.103 "compare_and_write": false, 00:15:41.103 "abort": true, 00:15:41.103 "seek_hole": false, 00:15:41.103 "seek_data": false, 00:15:41.103 "copy": true, 00:15:41.103 "nvme_iov_md": false 00:15:41.103 }, 00:15:41.103 "memory_domains": [ 00:15:41.105 { 00:15:41.105 "dma_device_id": "system", 00:15:41.105 "dma_device_type": 1 00:15:41.105 }, 00:15:41.105 { 00:15:41.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.105 "dma_device_type": 2 00:15:41.105 } 00:15:41.105 ], 00:15:41.105 "driver_specific": {} 00:15:41.105 }' 00:15:41.105 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.105 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.105 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.105 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.362 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.363 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.363 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.363 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.363 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:41.620 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.620 "name": "BaseBdev3", 00:15:41.620 "aliases": [ 00:15:41.620 "7528dd90-e9d6-4fbc-bd90-b60ae7106c08" 00:15:41.620 ], 00:15:41.620 "product_name": "Malloc disk", 00:15:41.620 "block_size": 512, 00:15:41.620 "num_blocks": 65536, 00:15:41.620 "uuid": "7528dd90-e9d6-4fbc-bd90-b60ae7106c08", 00:15:41.620 "assigned_rate_limits": { 00:15:41.620 "rw_ios_per_sec": 0, 00:15:41.620 "rw_mbytes_per_sec": 0, 00:15:41.621 "r_mbytes_per_sec": 0, 00:15:41.621 "w_mbytes_per_sec": 0 00:15:41.621 }, 00:15:41.621 "claimed": true, 00:15:41.621 "claim_type": "exclusive_write", 00:15:41.621 "zoned": false, 00:15:41.621 "supported_io_types": { 00:15:41.621 "read": true, 00:15:41.621 "write": true, 00:15:41.621 "unmap": true, 00:15:41.621 "flush": true, 00:15:41.621 "reset": true, 00:15:41.621 "nvme_admin": false, 00:15:41.621 "nvme_io": false, 00:15:41.621 "nvme_io_md": false, 00:15:41.621 "write_zeroes": true, 00:15:41.621 "zcopy": true, 00:15:41.621 "get_zone_info": false, 00:15:41.621 "zone_management": false, 00:15:41.621 "zone_append": false, 00:15:41.621 "compare": false, 00:15:41.621 "compare_and_write": false, 00:15:41.621 "abort": true, 00:15:41.621 
"seek_hole": false, 00:15:41.621 "seek_data": false, 00:15:41.621 "copy": true, 00:15:41.621 "nvme_iov_md": false 00:15:41.621 }, 00:15:41.621 "memory_domains": [ 00:15:41.621 { 00:15:41.621 "dma_device_id": "system", 00:15:41.621 "dma_device_type": 1 00:15:41.621 }, 00:15:41.621 { 00:15:41.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.621 "dma_device_type": 2 00:15:41.621 } 00:15:41.621 ], 00:15:41.621 "driver_specific": {} 00:15:41.621 }' 00:15:41.621 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.621 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.621 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.621 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:15:41.902 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.166 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.166 "name": "BaseBdev4", 00:15:42.166 "aliases": [ 00:15:42.166 "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1" 00:15:42.166 ], 00:15:42.166 "product_name": "Malloc disk", 00:15:42.166 "block_size": 512, 00:15:42.166 "num_blocks": 65536, 00:15:42.166 "uuid": "e76a432b-ea94-45a5-9ab2-bbfc6c33abe1", 00:15:42.166 "assigned_rate_limits": { 00:15:42.166 "rw_ios_per_sec": 0, 00:15:42.166 "rw_mbytes_per_sec": 0, 00:15:42.166 "r_mbytes_per_sec": 0, 00:15:42.166 "w_mbytes_per_sec": 0 00:15:42.166 }, 00:15:42.166 "claimed": true, 00:15:42.166 "claim_type": "exclusive_write", 00:15:42.166 "zoned": false, 00:15:42.166 "supported_io_types": { 00:15:42.166 "read": true, 00:15:42.166 "write": true, 00:15:42.166 "unmap": true, 00:15:42.166 "flush": true, 00:15:42.166 "reset": true, 00:15:42.166 "nvme_admin": false, 00:15:42.166 "nvme_io": false, 00:15:42.166 "nvme_io_md": false, 00:15:42.166 "write_zeroes": true, 00:15:42.166 "zcopy": true, 00:15:42.166 "get_zone_info": false, 00:15:42.166 "zone_management": false, 00:15:42.166 "zone_append": false, 00:15:42.166 "compare": false, 00:15:42.166 "compare_and_write": false, 00:15:42.166 "abort": true, 00:15:42.166 "seek_hole": false, 00:15:42.166 "seek_data": false, 00:15:42.166 "copy": true, 00:15:42.166 "nvme_iov_md": false 00:15:42.166 }, 00:15:42.166 "memory_domains": [ 00:15:42.166 { 00:15:42.166 "dma_device_id": "system", 00:15:42.166 "dma_device_type": 1 00:15:42.166 }, 00:15:42.166 { 00:15:42.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.166 "dma_device_type": 2 00:15:42.166 } 00:15:42.166 ], 00:15:42.166 "driver_specific": {} 00:15:42.166 }' 00:15:42.166 23:37:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.166 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.424 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:42.682 [2024-07-24 23:37:27.458900] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:42.682 [2024-07-24 23:37:27.458919] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.682 [2024-07-24 23:37:27.458954] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.682 [2024-07-24 23:37:27.458995] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.682 [2024-07-24 23:37:27.459001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23cc7e0 name Existed_Raid, state offline 00:15:42.682 23:37:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 316543 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 316543 ']' 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 316543 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 316543 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 316543' 00:15:42.682 killing process with pid 316543 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 316543 00:15:42.682 [2024-07-24 23:37:27.518719] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:42.682 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 316543 00:15:42.682 [2024-07-24 23:37:27.549089] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:42.940 00:15:42.940 real 0m24.387s 00:15:42.940 user 0m45.433s 00:15:42.940 sys 0m3.691s 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.940 ************************************ 00:15:42.940 END TEST raid_state_function_test 
00:15:42.940 ************************************ 00:15:42.940 23:37:27 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:15:42.940 23:37:27 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:42.940 23:37:27 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:42.940 23:37:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:42.940 ************************************ 00:15:42.940 START TEST raid_state_function_test_sb 00:15:42.940 ************************************ 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 true 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:15:42.940 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:42.941 
23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=321249 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 321249' 00:15:42.941 Process raid pid: 321249 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 321249 /var/tmp/spdk-raid.sock 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 321249 ']' 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:42.941 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:42.941 23:37:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.941 [2024-07-24 23:37:27.851761] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:15:42.941 [2024-07-24 23:37:27.851801] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:42.941 [2024-07-24 23:37:27.914868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:43.199 [2024-07-24 23:37:27.993886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:43.199 [2024-07-24 23:37:28.045187] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.199 [2024-07-24 23:37:28.045211] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.764 23:37:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:43.764 23:37:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:15:43.764 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:44.022 [2024-07-24 23:37:28.780060] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.022 [2024-07-24 23:37:28.780086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.022 [2024-07-24 23:37:28.780093] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:44.022 [2024-07-24 23:37:28.780099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:44.022 [2024-07-24 23:37:28.780103] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:44.022 [2024-07-24 23:37:28.780123] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:15:44.022 [2024-07-24 23:37:28.780128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:44.022 [2024-07-24 23:37:28.780133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.022 "name": "Existed_Raid", 00:15:44.022 "uuid": 
"af2f2917-fbe5-4c49-a37a-ee1ec679cf16", 00:15:44.022 "strip_size_kb": 64, 00:15:44.022 "state": "configuring", 00:15:44.022 "raid_level": "concat", 00:15:44.022 "superblock": true, 00:15:44.022 "num_base_bdevs": 4, 00:15:44.022 "num_base_bdevs_discovered": 0, 00:15:44.022 "num_base_bdevs_operational": 4, 00:15:44.022 "base_bdevs_list": [ 00:15:44.022 { 00:15:44.022 "name": "BaseBdev1", 00:15:44.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.022 "is_configured": false, 00:15:44.022 "data_offset": 0, 00:15:44.022 "data_size": 0 00:15:44.022 }, 00:15:44.022 { 00:15:44.022 "name": "BaseBdev2", 00:15:44.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.022 "is_configured": false, 00:15:44.022 "data_offset": 0, 00:15:44.022 "data_size": 0 00:15:44.022 }, 00:15:44.022 { 00:15:44.022 "name": "BaseBdev3", 00:15:44.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.022 "is_configured": false, 00:15:44.022 "data_offset": 0, 00:15:44.022 "data_size": 0 00:15:44.022 }, 00:15:44.022 { 00:15:44.022 "name": "BaseBdev4", 00:15:44.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.022 "is_configured": false, 00:15:44.022 "data_offset": 0, 00:15:44.022 "data_size": 0 00:15:44.022 } 00:15:44.022 ] 00:15:44.022 }' 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.022 23:37:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.588 23:37:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:44.846 [2024-07-24 23:37:29.614121] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:44.846 [2024-07-24 23:37:29.614139] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe14b50 name Existed_Raid, state configuring 00:15:44.846 23:37:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:44.846 [2024-07-24 23:37:29.782581] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.846 [2024-07-24 23:37:29.782599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.846 [2024-07-24 23:37:29.782604] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:44.846 [2024-07-24 23:37:29.782608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:44.846 [2024-07-24 23:37:29.782612] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:44.846 [2024-07-24 23:37:29.782617] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:44.846 [2024-07-24 23:37:29.782636] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:44.846 [2024-07-24 23:37:29.782641] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:44.846 23:37:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:45.103 [2024-07-24 23:37:29.962944] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.103 BaseBdev1 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 
-- # local bdev_timeout= 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:45.103 23:37:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:45.361 [ 00:15:45.361 { 00:15:45.361 "name": "BaseBdev1", 00:15:45.361 "aliases": [ 00:15:45.361 "0649d84d-6efc-4a72-950a-35cd060ab8a5" 00:15:45.361 ], 00:15:45.361 "product_name": "Malloc disk", 00:15:45.361 "block_size": 512, 00:15:45.361 "num_blocks": 65536, 00:15:45.361 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:45.361 "assigned_rate_limits": { 00:15:45.361 "rw_ios_per_sec": 0, 00:15:45.361 "rw_mbytes_per_sec": 0, 00:15:45.361 "r_mbytes_per_sec": 0, 00:15:45.361 "w_mbytes_per_sec": 0 00:15:45.361 }, 00:15:45.361 "claimed": true, 00:15:45.361 "claim_type": "exclusive_write", 00:15:45.361 "zoned": false, 00:15:45.361 "supported_io_types": { 00:15:45.361 "read": true, 00:15:45.361 "write": true, 00:15:45.361 "unmap": true, 00:15:45.361 "flush": true, 00:15:45.361 "reset": true, 00:15:45.361 "nvme_admin": false, 00:15:45.361 "nvme_io": false, 00:15:45.361 "nvme_io_md": false, 00:15:45.361 "write_zeroes": true, 00:15:45.361 "zcopy": true, 00:15:45.361 "get_zone_info": false, 00:15:45.361 "zone_management": false, 00:15:45.361 "zone_append": false, 00:15:45.361 "compare": false, 00:15:45.361 "compare_and_write": false, 00:15:45.361 "abort": true, 00:15:45.361 "seek_hole": 
false, 00:15:45.361 "seek_data": false, 00:15:45.361 "copy": true, 00:15:45.361 "nvme_iov_md": false 00:15:45.361 }, 00:15:45.361 "memory_domains": [ 00:15:45.361 { 00:15:45.361 "dma_device_id": "system", 00:15:45.361 "dma_device_type": 1 00:15:45.361 }, 00:15:45.361 { 00:15:45.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.361 "dma_device_type": 2 00:15:45.361 } 00:15:45.361 ], 00:15:45.361 "driver_specific": {} 00:15:45.361 } 00:15:45.361 ] 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.361 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.361 23:37:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.619 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.619 "name": "Existed_Raid", 00:15:45.619 "uuid": "65079afe-cc3f-46fa-994e-70527cbdb4e9", 00:15:45.619 "strip_size_kb": 64, 00:15:45.619 "state": "configuring", 00:15:45.619 "raid_level": "concat", 00:15:45.619 "superblock": true, 00:15:45.619 "num_base_bdevs": 4, 00:15:45.619 "num_base_bdevs_discovered": 1, 00:15:45.619 "num_base_bdevs_operational": 4, 00:15:45.619 "base_bdevs_list": [ 00:15:45.619 { 00:15:45.619 "name": "BaseBdev1", 00:15:45.619 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:45.619 "is_configured": true, 00:15:45.619 "data_offset": 2048, 00:15:45.619 "data_size": 63488 00:15:45.619 }, 00:15:45.619 { 00:15:45.619 "name": "BaseBdev2", 00:15:45.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.619 "is_configured": false, 00:15:45.619 "data_offset": 0, 00:15:45.619 "data_size": 0 00:15:45.619 }, 00:15:45.619 { 00:15:45.619 "name": "BaseBdev3", 00:15:45.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.619 "is_configured": false, 00:15:45.619 "data_offset": 0, 00:15:45.619 "data_size": 0 00:15:45.619 }, 00:15:45.619 { 00:15:45.619 "name": "BaseBdev4", 00:15:45.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.619 "is_configured": false, 00:15:45.619 "data_offset": 0, 00:15:45.619 "data_size": 0 00:15:45.619 } 00:15:45.619 ] 00:15:45.619 }' 00:15:45.619 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.619 23:37:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.180 23:37:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.180 [2024-07-24 
23:37:31.129960] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.180 [2024-07-24 23:37:31.129990] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe143a0 name Existed_Raid, state configuring 00:15:46.180 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:46.437 [2024-07-24 23:37:31.290401] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.437 [2024-07-24 23:37:31.291448] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.437 [2024-07-24 23:37:31.291477] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.437 [2024-07-24 23:37:31.291483] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.437 [2024-07-24 23:37:31.291488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.437 [2024-07-24 23:37:31.291492] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:46.437 [2024-07-24 23:37:31.291496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.437 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.695 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.695 "name": "Existed_Raid", 00:15:46.695 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:46.695 "strip_size_kb": 64, 00:15:46.695 "state": "configuring", 00:15:46.695 "raid_level": "concat", 00:15:46.695 "superblock": true, 00:15:46.695 "num_base_bdevs": 4, 00:15:46.695 "num_base_bdevs_discovered": 1, 00:15:46.695 "num_base_bdevs_operational": 4, 00:15:46.695 "base_bdevs_list": [ 00:15:46.695 { 00:15:46.695 "name": "BaseBdev1", 00:15:46.695 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:46.695 "is_configured": true, 00:15:46.695 "data_offset": 2048, 00:15:46.695 "data_size": 63488 00:15:46.695 }, 00:15:46.695 { 00:15:46.695 "name": "BaseBdev2", 00:15:46.695 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:46.695 "is_configured": false, 00:15:46.695 "data_offset": 0, 00:15:46.695 "data_size": 0 00:15:46.695 }, 00:15:46.695 { 00:15:46.695 "name": "BaseBdev3", 00:15:46.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.695 "is_configured": false, 00:15:46.695 "data_offset": 0, 00:15:46.695 "data_size": 0 00:15:46.695 }, 00:15:46.695 { 00:15:46.695 "name": "BaseBdev4", 00:15:46.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.695 "is_configured": false, 00:15:46.695 "data_offset": 0, 00:15:46.695 "data_size": 0 00:15:46.695 } 00:15:46.695 ] 00:15:46.695 }' 00:15:46.695 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.695 23:37:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.953 23:37:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:47.211 [2024-07-24 23:37:32.111093] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:47.211 BaseBdev2 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:47.211 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:47.469 [ 00:15:47.469 { 00:15:47.469 "name": "BaseBdev2", 00:15:47.469 "aliases": [ 00:15:47.469 "09cb4957-9456-4829-ae40-7615fbcafffa" 00:15:47.469 ], 00:15:47.469 "product_name": "Malloc disk", 00:15:47.469 "block_size": 512, 00:15:47.469 "num_blocks": 65536, 00:15:47.469 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:47.469 "assigned_rate_limits": { 00:15:47.469 "rw_ios_per_sec": 0, 00:15:47.469 "rw_mbytes_per_sec": 0, 00:15:47.469 "r_mbytes_per_sec": 0, 00:15:47.469 "w_mbytes_per_sec": 0 00:15:47.469 }, 00:15:47.469 "claimed": true, 00:15:47.469 "claim_type": "exclusive_write", 00:15:47.469 "zoned": false, 00:15:47.469 "supported_io_types": { 00:15:47.469 "read": true, 00:15:47.469 "write": true, 00:15:47.469 "unmap": true, 00:15:47.469 "flush": true, 00:15:47.469 "reset": true, 00:15:47.469 "nvme_admin": false, 00:15:47.469 "nvme_io": false, 00:15:47.469 "nvme_io_md": false, 00:15:47.469 "write_zeroes": true, 00:15:47.469 "zcopy": true, 00:15:47.469 "get_zone_info": false, 00:15:47.469 "zone_management": false, 00:15:47.469 "zone_append": false, 00:15:47.469 "compare": false, 00:15:47.469 "compare_and_write": false, 00:15:47.469 "abort": true, 00:15:47.469 "seek_hole": false, 00:15:47.469 "seek_data": false, 00:15:47.469 "copy": true, 00:15:47.469 "nvme_iov_md": false 00:15:47.469 }, 00:15:47.469 "memory_domains": [ 00:15:47.469 { 00:15:47.469 "dma_device_id": "system", 00:15:47.469 "dma_device_type": 1 00:15:47.469 }, 00:15:47.469 { 00:15:47.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.469 "dma_device_type": 2 00:15:47.469 } 00:15:47.469 ], 00:15:47.469 "driver_specific": {} 00:15:47.469 } 00:15:47.469 ] 
00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.469 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.727 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.727 "name": "Existed_Raid", 
00:15:47.727 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:47.727 "strip_size_kb": 64, 00:15:47.727 "state": "configuring", 00:15:47.727 "raid_level": "concat", 00:15:47.727 "superblock": true, 00:15:47.727 "num_base_bdevs": 4, 00:15:47.727 "num_base_bdevs_discovered": 2, 00:15:47.727 "num_base_bdevs_operational": 4, 00:15:47.727 "base_bdevs_list": [ 00:15:47.727 { 00:15:47.727 "name": "BaseBdev1", 00:15:47.727 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:47.727 "is_configured": true, 00:15:47.727 "data_offset": 2048, 00:15:47.727 "data_size": 63488 00:15:47.727 }, 00:15:47.727 { 00:15:47.727 "name": "BaseBdev2", 00:15:47.727 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:47.727 "is_configured": true, 00:15:47.727 "data_offset": 2048, 00:15:47.727 "data_size": 63488 00:15:47.727 }, 00:15:47.727 { 00:15:47.727 "name": "BaseBdev3", 00:15:47.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.728 "is_configured": false, 00:15:47.728 "data_offset": 0, 00:15:47.728 "data_size": 0 00:15:47.728 }, 00:15:47.728 { 00:15:47.728 "name": "BaseBdev4", 00:15:47.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.728 "is_configured": false, 00:15:47.728 "data_offset": 0, 00:15:47.728 "data_size": 0 00:15:47.728 } 00:15:47.728 ] 00:15:47.728 }' 00:15:47.728 23:37:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.728 23:37:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:48.292 [2024-07-24 23:37:33.236818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.292 BaseBdev3 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:48.292 
23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:48.292 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.549 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:48.806 [ 00:15:48.806 { 00:15:48.806 "name": "BaseBdev3", 00:15:48.806 "aliases": [ 00:15:48.806 "8a05c107-ff54-4b76-8ae0-392607caf6f7" 00:15:48.806 ], 00:15:48.806 "product_name": "Malloc disk", 00:15:48.806 "block_size": 512, 00:15:48.806 "num_blocks": 65536, 00:15:48.806 "uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:48.806 "assigned_rate_limits": { 00:15:48.806 "rw_ios_per_sec": 0, 00:15:48.806 "rw_mbytes_per_sec": 0, 00:15:48.806 "r_mbytes_per_sec": 0, 00:15:48.806 "w_mbytes_per_sec": 0 00:15:48.806 }, 00:15:48.806 "claimed": true, 00:15:48.806 "claim_type": "exclusive_write", 00:15:48.806 "zoned": false, 00:15:48.806 "supported_io_types": { 00:15:48.806 "read": true, 00:15:48.806 "write": true, 00:15:48.806 "unmap": true, 00:15:48.806 "flush": true, 00:15:48.806 "reset": true, 00:15:48.806 "nvme_admin": false, 00:15:48.806 "nvme_io": false, 00:15:48.806 "nvme_io_md": false, 00:15:48.806 "write_zeroes": true, 00:15:48.806 "zcopy": true, 00:15:48.806 "get_zone_info": 
false, 00:15:48.806 "zone_management": false, 00:15:48.806 "zone_append": false, 00:15:48.806 "compare": false, 00:15:48.806 "compare_and_write": false, 00:15:48.806 "abort": true, 00:15:48.806 "seek_hole": false, 00:15:48.806 "seek_data": false, 00:15:48.806 "copy": true, 00:15:48.806 "nvme_iov_md": false 00:15:48.806 }, 00:15:48.806 "memory_domains": [ 00:15:48.806 { 00:15:48.806 "dma_device_id": "system", 00:15:48.806 "dma_device_type": 1 00:15:48.806 }, 00:15:48.806 { 00:15:48.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.806 "dma_device_type": 2 00:15:48.806 } 00:15:48.806 ], 00:15:48.806 "driver_specific": {} 00:15:48.806 } 00:15:48.806 ] 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.806 23:37:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.806 "name": "Existed_Raid", 00:15:48.806 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:48.806 "strip_size_kb": 64, 00:15:48.806 "state": "configuring", 00:15:48.806 "raid_level": "concat", 00:15:48.806 "superblock": true, 00:15:48.806 "num_base_bdevs": 4, 00:15:48.806 "num_base_bdevs_discovered": 3, 00:15:48.806 "num_base_bdevs_operational": 4, 00:15:48.806 "base_bdevs_list": [ 00:15:48.806 { 00:15:48.806 "name": "BaseBdev1", 00:15:48.806 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:48.806 "is_configured": true, 00:15:48.806 "data_offset": 2048, 00:15:48.806 "data_size": 63488 00:15:48.806 }, 00:15:48.806 { 00:15:48.806 "name": "BaseBdev2", 00:15:48.806 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:48.806 "is_configured": true, 00:15:48.806 "data_offset": 2048, 00:15:48.806 "data_size": 63488 00:15:48.806 }, 00:15:48.806 { 00:15:48.806 "name": "BaseBdev3", 00:15:48.806 "uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:48.806 "is_configured": true, 00:15:48.806 "data_offset": 2048, 00:15:48.806 "data_size": 63488 00:15:48.806 }, 00:15:48.806 { 00:15:48.806 "name": "BaseBdev4", 00:15:48.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.806 "is_configured": false, 00:15:48.806 "data_offset": 0, 00:15:48.806 "data_size": 0 00:15:48.806 } 00:15:48.806 ] 00:15:48.806 }' 00:15:48.806 
23:37:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.806 23:37:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.369 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:49.369 [2024-07-24 23:37:34.366325] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:49.369 [2024-07-24 23:37:34.366441] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe153d0 00:15:49.369 [2024-07-24 23:37:34.366449] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:49.369 [2024-07-24 23:37:34.366576] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe150a0 00:15:49.369 [2024-07-24 23:37:34.366675] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe153d0 00:15:49.369 [2024-07-24 23:37:34.366680] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe153d0 00:15:49.369 [2024-07-24 23:37:34.366742] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:49.369 BaseBdev4 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:49.627 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:49.884 [ 00:15:49.884 { 00:15:49.884 "name": "BaseBdev4", 00:15:49.884 "aliases": [ 00:15:49.884 "d3285db1-af31-490e-9545-6f3c9ec0fcf8" 00:15:49.884 ], 00:15:49.884 "product_name": "Malloc disk", 00:15:49.884 "block_size": 512, 00:15:49.884 "num_blocks": 65536, 00:15:49.884 "uuid": "d3285db1-af31-490e-9545-6f3c9ec0fcf8", 00:15:49.884 "assigned_rate_limits": { 00:15:49.884 "rw_ios_per_sec": 0, 00:15:49.884 "rw_mbytes_per_sec": 0, 00:15:49.884 "r_mbytes_per_sec": 0, 00:15:49.884 "w_mbytes_per_sec": 0 00:15:49.884 }, 00:15:49.884 "claimed": true, 00:15:49.884 "claim_type": "exclusive_write", 00:15:49.884 "zoned": false, 00:15:49.884 "supported_io_types": { 00:15:49.884 "read": true, 00:15:49.884 "write": true, 00:15:49.884 "unmap": true, 00:15:49.884 "flush": true, 00:15:49.884 "reset": true, 00:15:49.884 "nvme_admin": false, 00:15:49.884 "nvme_io": false, 00:15:49.884 "nvme_io_md": false, 00:15:49.884 "write_zeroes": true, 00:15:49.884 "zcopy": true, 00:15:49.884 "get_zone_info": false, 00:15:49.884 "zone_management": false, 00:15:49.884 "zone_append": false, 00:15:49.884 "compare": false, 00:15:49.885 "compare_and_write": false, 00:15:49.885 "abort": true, 00:15:49.885 "seek_hole": false, 00:15:49.885 "seek_data": false, 00:15:49.885 "copy": true, 00:15:49.885 "nvme_iov_md": false 00:15:49.885 }, 00:15:49.885 "memory_domains": [ 00:15:49.885 { 00:15:49.885 "dma_device_id": "system", 00:15:49.885 "dma_device_type": 1 00:15:49.885 }, 00:15:49.885 { 00:15:49.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.885 
"dma_device_type": 2 00:15:49.885 } 00:15:49.885 ], 00:15:49.885 "driver_specific": {} 00:15:49.885 } 00:15:49.885 ] 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.885 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.143 23:37:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.143 "name": "Existed_Raid", 00:15:50.143 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:50.143 "strip_size_kb": 64, 00:15:50.143 "state": "online", 00:15:50.143 "raid_level": "concat", 00:15:50.143 "superblock": true, 00:15:50.143 "num_base_bdevs": 4, 00:15:50.143 "num_base_bdevs_discovered": 4, 00:15:50.143 "num_base_bdevs_operational": 4, 00:15:50.143 "base_bdevs_list": [ 00:15:50.143 { 00:15:50.143 "name": "BaseBdev1", 00:15:50.143 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:50.143 "is_configured": true, 00:15:50.143 "data_offset": 2048, 00:15:50.143 "data_size": 63488 00:15:50.143 }, 00:15:50.143 { 00:15:50.143 "name": "BaseBdev2", 00:15:50.143 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:50.143 "is_configured": true, 00:15:50.143 "data_offset": 2048, 00:15:50.143 "data_size": 63488 00:15:50.143 }, 00:15:50.143 { 00:15:50.143 "name": "BaseBdev3", 00:15:50.143 "uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:50.143 "is_configured": true, 00:15:50.143 "data_offset": 2048, 00:15:50.143 "data_size": 63488 00:15:50.143 }, 00:15:50.143 { 00:15:50.143 "name": "BaseBdev4", 00:15:50.143 "uuid": "d3285db1-af31-490e-9545-6f3c9ec0fcf8", 00:15:50.143 "is_configured": true, 00:15:50.143 "data_offset": 2048, 00:15:50.143 "data_size": 63488 00:15:50.143 } 00:15:50.143 ] 00:15:50.143 }' 00:15:50.143 23:37:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.143 23:37:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:50.400 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:50.658 [2024-07-24 23:37:35.549595] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:50.658 "name": "Existed_Raid", 00:15:50.658 "aliases": [ 00:15:50.658 "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87" 00:15:50.658 ], 00:15:50.658 "product_name": "Raid Volume", 00:15:50.658 "block_size": 512, 00:15:50.658 "num_blocks": 253952, 00:15:50.658 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:50.658 "assigned_rate_limits": { 00:15:50.658 "rw_ios_per_sec": 0, 00:15:50.658 "rw_mbytes_per_sec": 0, 00:15:50.658 "r_mbytes_per_sec": 0, 00:15:50.658 "w_mbytes_per_sec": 0 00:15:50.658 }, 00:15:50.658 "claimed": false, 00:15:50.658 "zoned": false, 00:15:50.658 "supported_io_types": { 00:15:50.658 "read": true, 00:15:50.658 "write": true, 00:15:50.658 "unmap": true, 00:15:50.658 "flush": true, 00:15:50.658 "reset": true, 00:15:50.658 "nvme_admin": false, 00:15:50.658 "nvme_io": false, 00:15:50.658 "nvme_io_md": false, 00:15:50.658 "write_zeroes": true, 00:15:50.658 "zcopy": false, 00:15:50.658 "get_zone_info": false, 00:15:50.658 "zone_management": false, 00:15:50.658 "zone_append": false, 00:15:50.658 "compare": false, 00:15:50.658 "compare_and_write": false, 00:15:50.658 "abort": false, 00:15:50.658 "seek_hole": 
false, 00:15:50.658 "seek_data": false, 00:15:50.658 "copy": false, 00:15:50.658 "nvme_iov_md": false 00:15:50.658 }, 00:15:50.658 "memory_domains": [ 00:15:50.658 { 00:15:50.658 "dma_device_id": "system", 00:15:50.658 "dma_device_type": 1 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.658 "dma_device_type": 2 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "system", 00:15:50.658 "dma_device_type": 1 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.658 "dma_device_type": 2 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "system", 00:15:50.658 "dma_device_type": 1 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.658 "dma_device_type": 2 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "system", 00:15:50.658 "dma_device_type": 1 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.658 "dma_device_type": 2 00:15:50.658 } 00:15:50.658 ], 00:15:50.658 "driver_specific": { 00:15:50.658 "raid": { 00:15:50.658 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:50.658 "strip_size_kb": 64, 00:15:50.658 "state": "online", 00:15:50.658 "raid_level": "concat", 00:15:50.658 "superblock": true, 00:15:50.658 "num_base_bdevs": 4, 00:15:50.658 "num_base_bdevs_discovered": 4, 00:15:50.658 "num_base_bdevs_operational": 4, 00:15:50.658 "base_bdevs_list": [ 00:15:50.658 { 00:15:50.658 "name": "BaseBdev1", 00:15:50.658 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:50.658 "is_configured": true, 00:15:50.658 "data_offset": 2048, 00:15:50.658 "data_size": 63488 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "name": "BaseBdev2", 00:15:50.658 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:50.658 "is_configured": true, 00:15:50.658 "data_offset": 2048, 00:15:50.658 "data_size": 63488 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "name": "BaseBdev3", 00:15:50.658 
"uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:50.658 "is_configured": true, 00:15:50.658 "data_offset": 2048, 00:15:50.658 "data_size": 63488 00:15:50.658 }, 00:15:50.658 { 00:15:50.658 "name": "BaseBdev4", 00:15:50.658 "uuid": "d3285db1-af31-490e-9545-6f3c9ec0fcf8", 00:15:50.658 "is_configured": true, 00:15:50.658 "data_offset": 2048, 00:15:50.658 "data_size": 63488 00:15:50.658 } 00:15:50.658 ] 00:15:50.658 } 00:15:50.658 } 00:15:50.658 }' 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:50.658 BaseBdev2 00:15:50.658 BaseBdev3 00:15:50.658 BaseBdev4' 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:50.658 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:50.916 "name": "BaseBdev1", 00:15:50.916 "aliases": [ 00:15:50.916 "0649d84d-6efc-4a72-950a-35cd060ab8a5" 00:15:50.916 ], 00:15:50.916 "product_name": "Malloc disk", 00:15:50.916 "block_size": 512, 00:15:50.916 "num_blocks": 65536, 00:15:50.916 "uuid": "0649d84d-6efc-4a72-950a-35cd060ab8a5", 00:15:50.916 "assigned_rate_limits": { 00:15:50.916 "rw_ios_per_sec": 0, 00:15:50.916 "rw_mbytes_per_sec": 0, 00:15:50.916 "r_mbytes_per_sec": 0, 00:15:50.916 "w_mbytes_per_sec": 0 00:15:50.916 }, 00:15:50.916 "claimed": true, 00:15:50.916 "claim_type": "exclusive_write", 00:15:50.916 "zoned": false, 00:15:50.916 "supported_io_types": { 
00:15:50.916 "read": true, 00:15:50.916 "write": true, 00:15:50.916 "unmap": true, 00:15:50.916 "flush": true, 00:15:50.916 "reset": true, 00:15:50.916 "nvme_admin": false, 00:15:50.916 "nvme_io": false, 00:15:50.916 "nvme_io_md": false, 00:15:50.916 "write_zeroes": true, 00:15:50.916 "zcopy": true, 00:15:50.916 "get_zone_info": false, 00:15:50.916 "zone_management": false, 00:15:50.916 "zone_append": false, 00:15:50.916 "compare": false, 00:15:50.916 "compare_and_write": false, 00:15:50.916 "abort": true, 00:15:50.916 "seek_hole": false, 00:15:50.916 "seek_data": false, 00:15:50.916 "copy": true, 00:15:50.916 "nvme_iov_md": false 00:15:50.916 }, 00:15:50.916 "memory_domains": [ 00:15:50.916 { 00:15:50.916 "dma_device_id": "system", 00:15:50.916 "dma_device_type": 1 00:15:50.916 }, 00:15:50.916 { 00:15:50.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.916 "dma_device_type": 2 00:15:50.916 } 00:15:50.916 ], 00:15:50.916 "driver_specific": {} 00:15:50.916 }' 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.916 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.174 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.174 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.174 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.174 23:37:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.174 23:37:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.174 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.174 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.174 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.174 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:51.174 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.432 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.432 "name": "BaseBdev2", 00:15:51.432 "aliases": [ 00:15:51.432 "09cb4957-9456-4829-ae40-7615fbcafffa" 00:15:51.432 ], 00:15:51.432 "product_name": "Malloc disk", 00:15:51.432 "block_size": 512, 00:15:51.432 "num_blocks": 65536, 00:15:51.432 "uuid": "09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:51.432 "assigned_rate_limits": { 00:15:51.432 "rw_ios_per_sec": 0, 00:15:51.432 "rw_mbytes_per_sec": 0, 00:15:51.432 "r_mbytes_per_sec": 0, 00:15:51.432 "w_mbytes_per_sec": 0 00:15:51.432 }, 00:15:51.432 "claimed": true, 00:15:51.432 "claim_type": "exclusive_write", 00:15:51.432 "zoned": false, 00:15:51.432 "supported_io_types": { 00:15:51.432 "read": true, 00:15:51.432 "write": true, 00:15:51.432 "unmap": true, 00:15:51.432 "flush": true, 00:15:51.432 "reset": true, 00:15:51.432 "nvme_admin": false, 00:15:51.432 "nvme_io": false, 00:15:51.432 "nvme_io_md": false, 00:15:51.432 "write_zeroes": true, 00:15:51.432 "zcopy": true, 00:15:51.432 "get_zone_info": false, 00:15:51.432 "zone_management": false, 00:15:51.432 "zone_append": false, 00:15:51.432 "compare": false, 00:15:51.432 "compare_and_write": false, 00:15:51.432 "abort": true, 00:15:51.432 "seek_hole": false, 00:15:51.432 "seek_data": 
false, 00:15:51.432 "copy": true, 00:15:51.432 "nvme_iov_md": false 00:15:51.432 }, 00:15:51.432 "memory_domains": [ 00:15:51.432 { 00:15:51.433 "dma_device_id": "system", 00:15:51.433 "dma_device_type": 1 00:15:51.433 }, 00:15:51.433 { 00:15:51.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.433 "dma_device_type": 2 00:15:51.433 } 00:15:51.433 ], 00:15:51.433 "driver_specific": {} 00:15:51.433 }' 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.433 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.692 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.692 "name": "BaseBdev3", 00:15:51.692 "aliases": [ 00:15:51.692 "8a05c107-ff54-4b76-8ae0-392607caf6f7" 00:15:51.692 ], 00:15:51.692 "product_name": "Malloc disk", 00:15:51.692 "block_size": 512, 00:15:51.692 "num_blocks": 65536, 00:15:51.692 "uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:51.692 "assigned_rate_limits": { 00:15:51.692 "rw_ios_per_sec": 0, 00:15:51.692 "rw_mbytes_per_sec": 0, 00:15:51.692 "r_mbytes_per_sec": 0, 00:15:51.692 "w_mbytes_per_sec": 0 00:15:51.692 }, 00:15:51.692 "claimed": true, 00:15:51.692 "claim_type": "exclusive_write", 00:15:51.692 "zoned": false, 00:15:51.692 "supported_io_types": { 00:15:51.692 "read": true, 00:15:51.692 "write": true, 00:15:51.692 "unmap": true, 00:15:51.692 "flush": true, 00:15:51.692 "reset": true, 00:15:51.692 "nvme_admin": false, 00:15:51.692 "nvme_io": false, 00:15:51.692 "nvme_io_md": false, 00:15:51.692 "write_zeroes": true, 00:15:51.692 "zcopy": true, 00:15:51.692 "get_zone_info": false, 00:15:51.692 "zone_management": false, 00:15:51.692 "zone_append": false, 00:15:51.692 "compare": false, 00:15:51.692 "compare_and_write": false, 00:15:51.692 "abort": true, 00:15:51.692 "seek_hole": false, 00:15:51.692 "seek_data": false, 00:15:51.692 "copy": true, 00:15:51.692 "nvme_iov_md": false 00:15:51.692 }, 00:15:51.692 "memory_domains": [ 00:15:51.692 { 00:15:51.692 "dma_device_id": "system", 00:15:51.692 "dma_device_type": 1 00:15:51.692 }, 00:15:51.692 { 00:15:51.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.692 "dma_device_type": 2 00:15:51.692 } 00:15:51.692 ], 00:15:51.692 "driver_specific": {} 00:15:51.692 }' 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.950 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.207 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.207 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.207 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.207 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:52.207 23:37:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.207 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.207 "name": "BaseBdev4", 00:15:52.207 "aliases": [ 00:15:52.207 "d3285db1-af31-490e-9545-6f3c9ec0fcf8" 00:15:52.207 ], 00:15:52.207 "product_name": "Malloc disk", 00:15:52.207 "block_size": 512, 00:15:52.207 "num_blocks": 65536, 00:15:52.207 "uuid": "d3285db1-af31-490e-9545-6f3c9ec0fcf8", 00:15:52.207 "assigned_rate_limits": { 00:15:52.207 
"rw_ios_per_sec": 0, 00:15:52.207 "rw_mbytes_per_sec": 0, 00:15:52.207 "r_mbytes_per_sec": 0, 00:15:52.207 "w_mbytes_per_sec": 0 00:15:52.207 }, 00:15:52.207 "claimed": true, 00:15:52.207 "claim_type": "exclusive_write", 00:15:52.207 "zoned": false, 00:15:52.207 "supported_io_types": { 00:15:52.207 "read": true, 00:15:52.207 "write": true, 00:15:52.207 "unmap": true, 00:15:52.207 "flush": true, 00:15:52.207 "reset": true, 00:15:52.207 "nvme_admin": false, 00:15:52.207 "nvme_io": false, 00:15:52.207 "nvme_io_md": false, 00:15:52.207 "write_zeroes": true, 00:15:52.207 "zcopy": true, 00:15:52.207 "get_zone_info": false, 00:15:52.207 "zone_management": false, 00:15:52.207 "zone_append": false, 00:15:52.207 "compare": false, 00:15:52.207 "compare_and_write": false, 00:15:52.207 "abort": true, 00:15:52.207 "seek_hole": false, 00:15:52.207 "seek_data": false, 00:15:52.207 "copy": true, 00:15:52.207 "nvme_iov_md": false 00:15:52.207 }, 00:15:52.207 "memory_domains": [ 00:15:52.207 { 00:15:52.207 "dma_device_id": "system", 00:15:52.207 "dma_device_type": 1 00:15:52.207 }, 00:15:52.207 { 00:15:52.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.207 "dma_device_type": 2 00:15:52.207 } 00:15:52.207 ], 00:15:52.207 "driver_specific": {} 00:15:52.207 }' 00:15:52.207 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.207 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.465 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:52.723 [2024-07-24 23:37:37.618757] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:52.723 [2024-07-24 23:37:37.618776] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:52.723 [2024-07-24 23:37:37.618808] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.723 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.981 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.981 "name": "Existed_Raid", 00:15:52.981 "uuid": "2fc3cf68-abe6-4b22-af5b-df7fdf39fa87", 00:15:52.981 "strip_size_kb": 64, 00:15:52.981 "state": "offline", 00:15:52.981 "raid_level": "concat", 00:15:52.981 "superblock": true, 00:15:52.981 "num_base_bdevs": 4, 00:15:52.981 "num_base_bdevs_discovered": 3, 00:15:52.981 "num_base_bdevs_operational": 3, 00:15:52.981 "base_bdevs_list": [ 00:15:52.981 { 00:15:52.981 "name": null, 00:15:52.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.981 "is_configured": false, 00:15:52.981 "data_offset": 2048, 00:15:52.981 "data_size": 63488 00:15:52.981 }, 00:15:52.981 { 00:15:52.981 "name": "BaseBdev2", 00:15:52.981 "uuid": 
"09cb4957-9456-4829-ae40-7615fbcafffa", 00:15:52.981 "is_configured": true, 00:15:52.981 "data_offset": 2048, 00:15:52.981 "data_size": 63488 00:15:52.981 }, 00:15:52.981 { 00:15:52.981 "name": "BaseBdev3", 00:15:52.981 "uuid": "8a05c107-ff54-4b76-8ae0-392607caf6f7", 00:15:52.981 "is_configured": true, 00:15:52.981 "data_offset": 2048, 00:15:52.981 "data_size": 63488 00:15:52.981 }, 00:15:52.981 { 00:15:52.981 "name": "BaseBdev4", 00:15:52.981 "uuid": "d3285db1-af31-490e-9545-6f3c9ec0fcf8", 00:15:52.981 "is_configured": true, 00:15:52.981 "data_offset": 2048, 00:15:52.981 "data_size": 63488 00:15:52.981 } 00:15:52.981 ] 00:15:52.981 }' 00:15:52.981 23:37:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.981 23:37:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:53.546 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:53.804 [2024-07-24 23:37:38.610215] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:53.804 23:37:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:53.804 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:53.804 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.804 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:54.062 [2024-07-24 23:37:38.952838] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.062 23:37:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:54.320 [2024-07-24 23:37:39.287431] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:54.320 [2024-07-24 23:37:39.287463] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe153d0 name Existed_Raid, state offline 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.320 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:54.578 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:54.836 BaseBdev2 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:54.836 23:37:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:54.836 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:55.095 [ 00:15:55.095 { 00:15:55.095 "name": "BaseBdev2", 00:15:55.095 "aliases": [ 00:15:55.095 "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6" 00:15:55.095 ], 00:15:55.095 "product_name": "Malloc disk", 00:15:55.095 "block_size": 512, 00:15:55.095 "num_blocks": 65536, 00:15:55.095 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:15:55.095 "assigned_rate_limits": { 00:15:55.095 "rw_ios_per_sec": 0, 00:15:55.095 "rw_mbytes_per_sec": 0, 00:15:55.095 "r_mbytes_per_sec": 0, 00:15:55.095 "w_mbytes_per_sec": 0 00:15:55.095 }, 00:15:55.095 "claimed": false, 00:15:55.095 "zoned": false, 00:15:55.095 "supported_io_types": { 00:15:55.095 "read": true, 00:15:55.095 "write": true, 00:15:55.095 "unmap": true, 00:15:55.095 "flush": true, 00:15:55.095 "reset": true, 00:15:55.095 "nvme_admin": false, 00:15:55.095 "nvme_io": false, 00:15:55.095 "nvme_io_md": false, 00:15:55.095 "write_zeroes": true, 00:15:55.095 "zcopy": true, 00:15:55.095 "get_zone_info": false, 00:15:55.095 "zone_management": false, 00:15:55.095 "zone_append": false, 00:15:55.095 "compare": false, 00:15:55.095 "compare_and_write": false, 00:15:55.095 "abort": true, 00:15:55.095 "seek_hole": false, 00:15:55.095 "seek_data": false, 00:15:55.095 "copy": true, 00:15:55.095 "nvme_iov_md": 
false 00:15:55.095 }, 00:15:55.095 "memory_domains": [ 00:15:55.095 { 00:15:55.095 "dma_device_id": "system", 00:15:55.095 "dma_device_type": 1 00:15:55.095 }, 00:15:55.095 { 00:15:55.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.095 "dma_device_type": 2 00:15:55.095 } 00:15:55.095 ], 00:15:55.095 "driver_specific": {} 00:15:55.095 } 00:15:55.095 ] 00:15:55.095 23:37:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:55.095 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:55.095 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:55.095 23:37:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:55.353 BaseBdev3 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.353 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:55.611 [ 00:15:55.611 { 00:15:55.611 "name": "BaseBdev3", 00:15:55.611 "aliases": [ 00:15:55.612 "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6" 00:15:55.612 ], 00:15:55.612 "product_name": "Malloc disk", 00:15:55.612 "block_size": 512, 00:15:55.612 "num_blocks": 65536, 00:15:55.612 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:15:55.612 "assigned_rate_limits": { 00:15:55.612 "rw_ios_per_sec": 0, 00:15:55.612 "rw_mbytes_per_sec": 0, 00:15:55.612 "r_mbytes_per_sec": 0, 00:15:55.612 "w_mbytes_per_sec": 0 00:15:55.612 }, 00:15:55.612 "claimed": false, 00:15:55.612 "zoned": false, 00:15:55.612 "supported_io_types": { 00:15:55.612 "read": true, 00:15:55.612 "write": true, 00:15:55.612 "unmap": true, 00:15:55.612 "flush": true, 00:15:55.612 "reset": true, 00:15:55.612 "nvme_admin": false, 00:15:55.612 "nvme_io": false, 00:15:55.612 "nvme_io_md": false, 00:15:55.612 "write_zeroes": true, 00:15:55.612 "zcopy": true, 00:15:55.612 "get_zone_info": false, 00:15:55.612 "zone_management": false, 00:15:55.612 "zone_append": false, 00:15:55.612 "compare": false, 00:15:55.612 "compare_and_write": false, 00:15:55.612 "abort": true, 00:15:55.612 "seek_hole": false, 00:15:55.612 "seek_data": false, 00:15:55.612 "copy": true, 00:15:55.612 "nvme_iov_md": false 00:15:55.612 }, 00:15:55.612 "memory_domains": [ 00:15:55.612 { 00:15:55.612 "dma_device_id": "system", 00:15:55.612 "dma_device_type": 1 00:15:55.612 }, 00:15:55.612 { 00:15:55.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.612 "dma_device_type": 2 00:15:55.612 } 00:15:55.612 ], 00:15:55.612 "driver_specific": {} 00:15:55.612 } 00:15:55.612 ] 00:15:55.612 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:55.612 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:55.612 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
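The trace above repeatedly runs `waitforbdev NAME` after `bdev_malloc_create`, which polls `rpc.py ... bdev_get_bdevs -b NAME -t 2000` until the new bdev shows up. As a rough editor's sketch (not SPDK code; `get_bdevs` here stands in for whatever callable returns the RPC's JSON output), the polling loop looks like:

```python
import json
import time

def wait_for_bdev(get_bdevs, name, timeout_s=2.0, poll_s=0.1):
    """Poll an RPC-style getter until the named bdev appears, mirroring the
    waitforbdev helper in the log (bdev_get_bdevs -b NAME -t 2000).
    `get_bdevs` is any callable returning a JSON string of bdev descriptors."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        for bdev in json.loads(get_bdevs()):
            if bdev.get("name") == name:
                return bdev  # descriptor dict, e.g. block_size / num_blocks
        time.sleep(poll_s)
    raise TimeoutError(f"bdev {name} did not appear within {timeout_s}s")
```

In the real script the 2000 ms budget is passed straight to the RPC (`-t 2000`) rather than looped in shell; this sketch only models the observable behavior.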
00:15:55.612 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:55.870 BaseBdev4 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.870 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:56.129 [ 00:15:56.129 { 00:15:56.129 "name": "BaseBdev4", 00:15:56.129 "aliases": [ 00:15:56.129 "195228c2-fe72-419d-966a-942ad945756e" 00:15:56.129 ], 00:15:56.129 "product_name": "Malloc disk", 00:15:56.129 "block_size": 512, 00:15:56.129 "num_blocks": 65536, 00:15:56.129 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:15:56.129 "assigned_rate_limits": { 00:15:56.129 "rw_ios_per_sec": 0, 00:15:56.129 "rw_mbytes_per_sec": 0, 00:15:56.129 "r_mbytes_per_sec": 0, 00:15:56.129 "w_mbytes_per_sec": 0 00:15:56.129 }, 00:15:56.129 "claimed": false, 00:15:56.129 "zoned": false, 00:15:56.129 "supported_io_types": { 00:15:56.129 
"read": true, 00:15:56.129 "write": true, 00:15:56.129 "unmap": true, 00:15:56.129 "flush": true, 00:15:56.129 "reset": true, 00:15:56.129 "nvme_admin": false, 00:15:56.129 "nvme_io": false, 00:15:56.129 "nvme_io_md": false, 00:15:56.129 "write_zeroes": true, 00:15:56.129 "zcopy": true, 00:15:56.129 "get_zone_info": false, 00:15:56.129 "zone_management": false, 00:15:56.129 "zone_append": false, 00:15:56.129 "compare": false, 00:15:56.129 "compare_and_write": false, 00:15:56.129 "abort": true, 00:15:56.129 "seek_hole": false, 00:15:56.129 "seek_data": false, 00:15:56.129 "copy": true, 00:15:56.129 "nvme_iov_md": false 00:15:56.129 }, 00:15:56.129 "memory_domains": [ 00:15:56.129 { 00:15:56.129 "dma_device_id": "system", 00:15:56.129 "dma_device_type": 1 00:15:56.129 }, 00:15:56.129 { 00:15:56.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.129 "dma_device_type": 2 00:15:56.129 } 00:15:56.129 ], 00:15:56.129 "driver_specific": {} 00:15:56.129 } 00:15:56.129 ] 00:15:56.129 23:37:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:56.129 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:56.129 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:56.129 23:37:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:56.387 [2024-07-24 23:37:41.129049] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:56.388 [2024-07-24 23:37:41.129078] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:56.388 [2024-07-24 23:37:41.129089] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:56.388 [2024-07-24 
23:37:41.130149] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:56.388 [2024-07-24 23:37:41.130180] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.388 "name": "Existed_Raid", 00:15:56.388 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:15:56.388 "strip_size_kb": 64, 
00:15:56.388 "state": "configuring", 00:15:56.388 "raid_level": "concat", 00:15:56.388 "superblock": true, 00:15:56.388 "num_base_bdevs": 4, 00:15:56.388 "num_base_bdevs_discovered": 3, 00:15:56.388 "num_base_bdevs_operational": 4, 00:15:56.388 "base_bdevs_list": [ 00:15:56.388 { 00:15:56.388 "name": "BaseBdev1", 00:15:56.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:56.388 "is_configured": false, 00:15:56.388 "data_offset": 0, 00:15:56.388 "data_size": 0 00:15:56.388 }, 00:15:56.388 { 00:15:56.388 "name": "BaseBdev2", 00:15:56.388 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:15:56.388 "is_configured": true, 00:15:56.388 "data_offset": 2048, 00:15:56.388 "data_size": 63488 00:15:56.388 }, 00:15:56.388 { 00:15:56.388 "name": "BaseBdev3", 00:15:56.388 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:15:56.388 "is_configured": true, 00:15:56.388 "data_offset": 2048, 00:15:56.388 "data_size": 63488 00:15:56.388 }, 00:15:56.388 { 00:15:56.388 "name": "BaseBdev4", 00:15:56.388 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:15:56.388 "is_configured": true, 00:15:56.388 "data_offset": 2048, 00:15:56.388 "data_size": 63488 00:15:56.388 } 00:15:56.388 ] 00:15:56.388 }' 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.388 23:37:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.955 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:57.214 [2024-07-24 23:37:41.979221] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:57.214 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:57.214 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:15:57.214 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:57.214 23:37:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.214 "name": "Existed_Raid", 00:15:57.214 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:15:57.214 "strip_size_kb": 64, 00:15:57.214 "state": "configuring", 00:15:57.214 "raid_level": "concat", 00:15:57.214 "superblock": true, 00:15:57.214 "num_base_bdevs": 4, 00:15:57.214 "num_base_bdevs_discovered": 2, 00:15:57.214 "num_base_bdevs_operational": 4, 00:15:57.214 "base_bdevs_list": [ 00:15:57.214 { 00:15:57.214 "name": "BaseBdev1", 00:15:57.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.214 "is_configured": false, 00:15:57.214 "data_offset": 0, 00:15:57.214 "data_size": 0 
00:15:57.214 }, 00:15:57.214 { 00:15:57.214 "name": null, 00:15:57.214 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:15:57.214 "is_configured": false, 00:15:57.214 "data_offset": 2048, 00:15:57.214 "data_size": 63488 00:15:57.214 }, 00:15:57.214 { 00:15:57.214 "name": "BaseBdev3", 00:15:57.214 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:15:57.214 "is_configured": true, 00:15:57.214 "data_offset": 2048, 00:15:57.214 "data_size": 63488 00:15:57.214 }, 00:15:57.214 { 00:15:57.214 "name": "BaseBdev4", 00:15:57.214 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:15:57.214 "is_configured": true, 00:15:57.214 "data_offset": 2048, 00:15:57.214 "data_size": 63488 00:15:57.214 } 00:15:57.214 ] 00:15:57.214 }' 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.214 23:37:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:57.780 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.780 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:58.039 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:58.039 23:37:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:58.039 [2024-07-24 23:37:43.004672] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:58.039 BaseBdev1 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 
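Each `verify_raid_bdev_state Existed_Raid <state> concat 64 <n>` call above fetches `bdev_raid_get_bdevs all`, filters it with `jq -r '.[] | select(.name == "Existed_Raid")'`, and compares the fields against the expected values. A minimal Python re-statement of that check (an editor's sketch over the JSON shape visible in the log, not the shell helper itself):

```python
import json

def verify_raid_bdev_state(raid_bdevs_json, name, expected_state,
                           raid_level, strip_size_kb, num_operational):
    """Select the named raid bdev from bdev_raid_get_bdevs output (the jq
    '.[] | select(.name == "...")' filter) and check its state fields."""
    info = next(b for b in json.loads(raid_bdevs_json) if b["name"] == name)
    assert info["state"] == expected_state, info["state"]
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational
    # num_base_bdevs_discovered must agree with the configured entries
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert discovered == info["num_base_bdevs_discovered"]
    return info
```

For the "configuring" snapshot in the log (BaseBdev1 not yet configured, BaseBdev2-4 configured), this yields `num_base_bdevs_discovered == 3` with `num_base_bdevs_operational == 4`, exactly as printed.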
00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:58.039 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.297 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:58.556 [ 00:15:58.556 { 00:15:58.556 "name": "BaseBdev1", 00:15:58.556 "aliases": [ 00:15:58.556 "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef" 00:15:58.556 ], 00:15:58.556 "product_name": "Malloc disk", 00:15:58.556 "block_size": 512, 00:15:58.556 "num_blocks": 65536, 00:15:58.556 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:15:58.556 "assigned_rate_limits": { 00:15:58.556 "rw_ios_per_sec": 0, 00:15:58.556 "rw_mbytes_per_sec": 0, 00:15:58.556 "r_mbytes_per_sec": 0, 00:15:58.556 "w_mbytes_per_sec": 0 00:15:58.556 }, 00:15:58.556 "claimed": true, 00:15:58.556 "claim_type": "exclusive_write", 00:15:58.556 "zoned": false, 00:15:58.556 "supported_io_types": { 00:15:58.556 "read": true, 00:15:58.556 "write": true, 00:15:58.556 "unmap": true, 00:15:58.556 "flush": true, 00:15:58.556 "reset": true, 00:15:58.556 "nvme_admin": false, 00:15:58.556 "nvme_io": false, 00:15:58.556 "nvme_io_md": false, 00:15:58.556 "write_zeroes": true, 00:15:58.556 "zcopy": true, 00:15:58.556 "get_zone_info": false, 00:15:58.556 "zone_management": false, 00:15:58.556 "zone_append": false, 00:15:58.556 "compare": false, 
00:15:58.556 "compare_and_write": false, 00:15:58.556 "abort": true, 00:15:58.556 "seek_hole": false, 00:15:58.556 "seek_data": false, 00:15:58.556 "copy": true, 00:15:58.556 "nvme_iov_md": false 00:15:58.556 }, 00:15:58.556 "memory_domains": [ 00:15:58.556 { 00:15:58.556 "dma_device_id": "system", 00:15:58.556 "dma_device_type": 1 00:15:58.556 }, 00:15:58.556 { 00:15:58.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.556 "dma_device_type": 2 00:15:58.556 } 00:15:58.556 ], 00:15:58.556 "driver_specific": {} 00:15:58.556 } 00:15:58.556 ] 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.556 "name": "Existed_Raid", 00:15:58.556 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:15:58.556 "strip_size_kb": 64, 00:15:58.556 "state": "configuring", 00:15:58.556 "raid_level": "concat", 00:15:58.556 "superblock": true, 00:15:58.556 "num_base_bdevs": 4, 00:15:58.556 "num_base_bdevs_discovered": 3, 00:15:58.556 "num_base_bdevs_operational": 4, 00:15:58.556 "base_bdevs_list": [ 00:15:58.556 { 00:15:58.556 "name": "BaseBdev1", 00:15:58.556 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:15:58.556 "is_configured": true, 00:15:58.556 "data_offset": 2048, 00:15:58.556 "data_size": 63488 00:15:58.556 }, 00:15:58.556 { 00:15:58.556 "name": null, 00:15:58.556 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:15:58.556 "is_configured": false, 00:15:58.556 "data_offset": 2048, 00:15:58.556 "data_size": 63488 00:15:58.556 }, 00:15:58.556 { 00:15:58.556 "name": "BaseBdev3", 00:15:58.556 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:15:58.556 "is_configured": true, 00:15:58.556 "data_offset": 2048, 00:15:58.556 "data_size": 63488 00:15:58.556 }, 00:15:58.556 { 00:15:58.556 "name": "BaseBdev4", 00:15:58.556 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:15:58.556 "is_configured": true, 00:15:58.556 "data_offset": 2048, 00:15:58.556 "data_size": 63488 00:15:58.556 } 00:15:58.556 ] 00:15:58.556 }' 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.556 23:37:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.124 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.124 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:59.390 [2024-07-24 23:37:44.352342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:59.390 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.650 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.650 "name": "Existed_Raid", 00:15:59.650 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:15:59.650 "strip_size_kb": 64, 00:15:59.650 "state": "configuring", 00:15:59.650 "raid_level": "concat", 00:15:59.650 "superblock": true, 00:15:59.650 "num_base_bdevs": 4, 00:15:59.650 "num_base_bdevs_discovered": 2, 00:15:59.650 "num_base_bdevs_operational": 4, 00:15:59.650 "base_bdevs_list": [ 00:15:59.650 { 00:15:59.650 "name": "BaseBdev1", 00:15:59.650 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:15:59.650 "is_configured": true, 00:15:59.650 "data_offset": 2048, 00:15:59.650 "data_size": 63488 00:15:59.650 }, 00:15:59.650 { 00:15:59.650 "name": null, 00:15:59.650 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:15:59.650 "is_configured": false, 00:15:59.650 "data_offset": 2048, 00:15:59.650 "data_size": 63488 00:15:59.650 }, 00:15:59.650 { 00:15:59.650 "name": null, 00:15:59.650 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:15:59.651 "is_configured": false, 00:15:59.651 "data_offset": 2048, 00:15:59.651 "data_size": 63488 00:15:59.651 }, 00:15:59.651 { 00:15:59.651 "name": "BaseBdev4", 00:15:59.651 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:15:59.651 "is_configured": true, 00:15:59.651 "data_offset": 2048, 00:15:59.651 "data_size": 63488 00:15:59.651 } 00:15:59.651 ] 00:15:59.651 }' 00:15:59.651 23:37:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.651 23:37:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.216 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:00.216 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:00.216 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:00.216 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:00.475 [2024-07-24 23:37:45.354959] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:00.475 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.733 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.733 "name": "Existed_Raid", 00:16:00.733 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:00.733 "strip_size_kb": 64, 00:16:00.733 "state": "configuring", 00:16:00.733 "raid_level": "concat", 00:16:00.733 "superblock": true, 00:16:00.733 "num_base_bdevs": 4, 00:16:00.733 "num_base_bdevs_discovered": 3, 00:16:00.733 "num_base_bdevs_operational": 4, 00:16:00.733 "base_bdevs_list": [ 00:16:00.733 { 00:16:00.733 "name": "BaseBdev1", 00:16:00.733 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:00.733 "is_configured": true, 00:16:00.733 "data_offset": 2048, 00:16:00.733 "data_size": 63488 00:16:00.733 }, 00:16:00.733 { 00:16:00.733 "name": null, 00:16:00.733 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:00.733 "is_configured": false, 00:16:00.733 "data_offset": 2048, 00:16:00.733 "data_size": 63488 00:16:00.733 }, 00:16:00.733 { 00:16:00.733 "name": "BaseBdev3", 00:16:00.733 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:00.733 "is_configured": true, 00:16:00.733 "data_offset": 2048, 00:16:00.733 "data_size": 63488 00:16:00.733 }, 00:16:00.733 { 00:16:00.733 "name": "BaseBdev4", 00:16:00.733 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:00.733 "is_configured": true, 00:16:00.733 "data_offset": 2048, 00:16:00.733 "data_size": 63488 00:16:00.733 } 00:16:00.733 ] 00:16:00.733 }' 00:16:00.733 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.733 23:37:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.299 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:01.299 23:37:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:01.299 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:01.299 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:01.558 [2024-07-24 23:37:46.309439] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.558 
23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.558 "name": "Existed_Raid", 00:16:01.558 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:01.558 "strip_size_kb": 64, 00:16:01.558 "state": "configuring", 00:16:01.558 "raid_level": "concat", 00:16:01.558 "superblock": true, 00:16:01.558 "num_base_bdevs": 4, 00:16:01.558 "num_base_bdevs_discovered": 2, 00:16:01.558 "num_base_bdevs_operational": 4, 00:16:01.558 "base_bdevs_list": [ 00:16:01.558 { 00:16:01.558 "name": null, 00:16:01.558 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:01.558 "is_configured": false, 00:16:01.558 "data_offset": 2048, 00:16:01.558 "data_size": 63488 00:16:01.558 }, 00:16:01.558 { 00:16:01.558 "name": null, 00:16:01.558 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:01.558 "is_configured": false, 00:16:01.558 "data_offset": 2048, 00:16:01.558 "data_size": 63488 00:16:01.558 }, 00:16:01.558 { 00:16:01.558 "name": "BaseBdev3", 00:16:01.558 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:01.558 "is_configured": true, 00:16:01.558 "data_offset": 2048, 00:16:01.558 "data_size": 63488 00:16:01.558 }, 00:16:01.558 { 00:16:01.558 "name": "BaseBdev4", 00:16:01.558 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:01.558 "is_configured": true, 00:16:01.558 "data_offset": 2048, 00:16:01.558 "data_size": 63488 00:16:01.558 } 00:16:01.558 ] 00:16:01.558 }' 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.558 23:37:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.123 23:37:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.123 23:37:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:02.382 [2024-07-24 23:37:47.305701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.382 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.382 23:37:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.640 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.640 "name": "Existed_Raid", 00:16:02.640 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:02.640 "strip_size_kb": 64, 00:16:02.640 "state": "configuring", 00:16:02.640 "raid_level": "concat", 00:16:02.640 "superblock": true, 00:16:02.640 "num_base_bdevs": 4, 00:16:02.640 "num_base_bdevs_discovered": 3, 00:16:02.640 "num_base_bdevs_operational": 4, 00:16:02.640 "base_bdevs_list": [ 00:16:02.640 { 00:16:02.641 "name": null, 00:16:02.641 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:02.641 "is_configured": false, 00:16:02.641 "data_offset": 2048, 00:16:02.641 "data_size": 63488 00:16:02.641 }, 00:16:02.641 { 00:16:02.641 "name": "BaseBdev2", 00:16:02.641 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:02.641 "is_configured": true, 00:16:02.641 "data_offset": 2048, 00:16:02.641 "data_size": 63488 00:16:02.641 }, 00:16:02.641 { 00:16:02.641 "name": "BaseBdev3", 00:16:02.641 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:02.641 "is_configured": true, 00:16:02.641 "data_offset": 2048, 00:16:02.641 "data_size": 63488 00:16:02.641 }, 00:16:02.641 { 00:16:02.641 "name": "BaseBdev4", 00:16:02.641 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:02.641 "is_configured": true, 00:16:02.641 "data_offset": 2048, 00:16:02.641 "data_size": 63488 00:16:02.641 } 00:16:02.641 ] 00:16:02.641 }' 00:16:02.641 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.641 23:37:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.209 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:03.209 23:37:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.209 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:03.209 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.209 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:03.467 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8f3df9f6-ecc4-410f-9f82-7d91e7e06cef 00:16:03.725 [2024-07-24 23:37:48.495460] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:03.725 [2024-07-24 23:37:48.495586] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe178d0 00:16:03.725 [2024-07-24 23:37:48.495594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:03.725 [2024-07-24 23:37:48.495714] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe012e0 00:16:03.725 [2024-07-24 23:37:48.495801] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe178d0 00:16:03.725 [2024-07-24 23:37:48.495806] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe178d0 00:16:03.725 [2024-07-24 23:37:48.495867] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.725 NewBaseBdev 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:03.725 23:37:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.725 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:03.984 [ 00:16:03.984 { 00:16:03.984 "name": "NewBaseBdev", 00:16:03.984 "aliases": [ 00:16:03.984 "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef" 00:16:03.984 ], 00:16:03.984 "product_name": "Malloc disk", 00:16:03.984 "block_size": 512, 00:16:03.984 "num_blocks": 65536, 00:16:03.984 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:03.984 "assigned_rate_limits": { 00:16:03.984 "rw_ios_per_sec": 0, 00:16:03.984 "rw_mbytes_per_sec": 0, 00:16:03.984 "r_mbytes_per_sec": 0, 00:16:03.984 "w_mbytes_per_sec": 0 00:16:03.984 }, 00:16:03.984 "claimed": true, 00:16:03.984 "claim_type": "exclusive_write", 00:16:03.984 "zoned": false, 00:16:03.984 "supported_io_types": { 00:16:03.984 "read": true, 00:16:03.984 "write": true, 00:16:03.984 "unmap": true, 00:16:03.984 "flush": true, 00:16:03.984 "reset": true, 00:16:03.984 "nvme_admin": false, 00:16:03.984 "nvme_io": false, 00:16:03.984 "nvme_io_md": false, 00:16:03.984 "write_zeroes": true, 00:16:03.984 "zcopy": true, 00:16:03.984 "get_zone_info": false, 00:16:03.984 "zone_management": false, 00:16:03.984 "zone_append": false, 00:16:03.984 "compare": false, 00:16:03.984 
"compare_and_write": false, 00:16:03.984 "abort": true, 00:16:03.984 "seek_hole": false, 00:16:03.984 "seek_data": false, 00:16:03.984 "copy": true, 00:16:03.984 "nvme_iov_md": false 00:16:03.984 }, 00:16:03.984 "memory_domains": [ 00:16:03.984 { 00:16:03.984 "dma_device_id": "system", 00:16:03.984 "dma_device_type": 1 00:16:03.984 }, 00:16:03.984 { 00:16:03.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.984 "dma_device_type": 2 00:16:03.984 } 00:16:03.984 ], 00:16:03.984 "driver_specific": {} 00:16:03.984 } 00:16:03.984 ] 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.984 23:37:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.242 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.242 "name": "Existed_Raid", 00:16:04.242 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:04.242 "strip_size_kb": 64, 00:16:04.242 "state": "online", 00:16:04.242 "raid_level": "concat", 00:16:04.242 "superblock": true, 00:16:04.242 "num_base_bdevs": 4, 00:16:04.242 "num_base_bdevs_discovered": 4, 00:16:04.242 "num_base_bdevs_operational": 4, 00:16:04.242 "base_bdevs_list": [ 00:16:04.242 { 00:16:04.242 "name": "NewBaseBdev", 00:16:04.242 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:04.242 "is_configured": true, 00:16:04.242 "data_offset": 2048, 00:16:04.242 "data_size": 63488 00:16:04.242 }, 00:16:04.242 { 00:16:04.242 "name": "BaseBdev2", 00:16:04.242 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:04.242 "is_configured": true, 00:16:04.242 "data_offset": 2048, 00:16:04.242 "data_size": 63488 00:16:04.242 }, 00:16:04.242 { 00:16:04.242 "name": "BaseBdev3", 00:16:04.242 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:04.242 "is_configured": true, 00:16:04.242 "data_offset": 2048, 00:16:04.242 "data_size": 63488 00:16:04.242 }, 00:16:04.242 { 00:16:04.242 "name": "BaseBdev4", 00:16:04.242 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:04.242 "is_configured": true, 00:16:04.242 "data_offset": 2048, 00:16:04.242 "data_size": 63488 00:16:04.242 } 00:16:04.242 ] 00:16:04.242 }' 00:16:04.242 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.242 23:37:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:04.805 23:37:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:04.805 [2024-07-24 23:37:49.646661] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:04.805 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:04.805 "name": "Existed_Raid", 00:16:04.805 "aliases": [ 00:16:04.805 "657414e6-debe-404d-b2f1-68dcefd06efc" 00:16:04.805 ], 00:16:04.805 "product_name": "Raid Volume", 00:16:04.805 "block_size": 512, 00:16:04.805 "num_blocks": 253952, 00:16:04.805 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:04.805 "assigned_rate_limits": { 00:16:04.805 "rw_ios_per_sec": 0, 00:16:04.805 "rw_mbytes_per_sec": 0, 00:16:04.805 "r_mbytes_per_sec": 0, 00:16:04.805 "w_mbytes_per_sec": 0 00:16:04.805 }, 00:16:04.805 "claimed": false, 00:16:04.805 "zoned": false, 00:16:04.806 "supported_io_types": { 00:16:04.806 "read": true, 00:16:04.806 "write": true, 00:16:04.806 "unmap": true, 00:16:04.806 "flush": true, 00:16:04.806 "reset": true, 00:16:04.806 "nvme_admin": false, 00:16:04.806 "nvme_io": false, 00:16:04.806 "nvme_io_md": false, 00:16:04.806 "write_zeroes": true, 00:16:04.806 "zcopy": false, 00:16:04.806 
"get_zone_info": false, 00:16:04.806 "zone_management": false, 00:16:04.806 "zone_append": false, 00:16:04.806 "compare": false, 00:16:04.806 "compare_and_write": false, 00:16:04.806 "abort": false, 00:16:04.806 "seek_hole": false, 00:16:04.806 "seek_data": false, 00:16:04.806 "copy": false, 00:16:04.806 "nvme_iov_md": false 00:16:04.806 }, 00:16:04.806 "memory_domains": [ 00:16:04.806 { 00:16:04.806 "dma_device_id": "system", 00:16:04.806 "dma_device_type": 1 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.806 "dma_device_type": 2 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "system", 00:16:04.806 "dma_device_type": 1 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.806 "dma_device_type": 2 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "system", 00:16:04.806 "dma_device_type": 1 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.806 "dma_device_type": 2 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "system", 00:16:04.806 "dma_device_type": 1 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.806 "dma_device_type": 2 00:16:04.806 } 00:16:04.806 ], 00:16:04.806 "driver_specific": { 00:16:04.806 "raid": { 00:16:04.806 "uuid": "657414e6-debe-404d-b2f1-68dcefd06efc", 00:16:04.806 "strip_size_kb": 64, 00:16:04.806 "state": "online", 00:16:04.806 "raid_level": "concat", 00:16:04.806 "superblock": true, 00:16:04.806 "num_base_bdevs": 4, 00:16:04.806 "num_base_bdevs_discovered": 4, 00:16:04.806 "num_base_bdevs_operational": 4, 00:16:04.806 "base_bdevs_list": [ 00:16:04.806 { 00:16:04.806 "name": "NewBaseBdev", 00:16:04.806 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:04.806 "is_configured": true, 00:16:04.806 "data_offset": 2048, 00:16:04.806 "data_size": 63488 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "name": "BaseBdev2", 00:16:04.806 
"uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:04.806 "is_configured": true, 00:16:04.806 "data_offset": 2048, 00:16:04.806 "data_size": 63488 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "name": "BaseBdev3", 00:16:04.806 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:04.806 "is_configured": true, 00:16:04.806 "data_offset": 2048, 00:16:04.806 "data_size": 63488 00:16:04.806 }, 00:16:04.806 { 00:16:04.806 "name": "BaseBdev4", 00:16:04.806 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:04.806 "is_configured": true, 00:16:04.806 "data_offset": 2048, 00:16:04.806 "data_size": 63488 00:16:04.806 } 00:16:04.806 ] 00:16:04.806 } 00:16:04.806 } 00:16:04.806 }' 00:16:04.806 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:04.806 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:04.806 BaseBdev2 00:16:04.806 BaseBdev3 00:16:04.806 BaseBdev4' 00:16:04.806 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:04.806 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:04.806 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.063 "name": "NewBaseBdev", 00:16:05.063 "aliases": [ 00:16:05.063 "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef" 00:16:05.063 ], 00:16:05.063 "product_name": "Malloc disk", 00:16:05.063 "block_size": 512, 00:16:05.063 "num_blocks": 65536, 00:16:05.063 "uuid": "8f3df9f6-ecc4-410f-9f82-7d91e7e06cef", 00:16:05.063 "assigned_rate_limits": { 00:16:05.063 "rw_ios_per_sec": 0, 00:16:05.063 "rw_mbytes_per_sec": 0, 
00:16:05.063 "r_mbytes_per_sec": 0, 00:16:05.063 "w_mbytes_per_sec": 0 00:16:05.063 }, 00:16:05.063 "claimed": true, 00:16:05.063 "claim_type": "exclusive_write", 00:16:05.063 "zoned": false, 00:16:05.063 "supported_io_types": { 00:16:05.063 "read": true, 00:16:05.063 "write": true, 00:16:05.063 "unmap": true, 00:16:05.063 "flush": true, 00:16:05.063 "reset": true, 00:16:05.063 "nvme_admin": false, 00:16:05.063 "nvme_io": false, 00:16:05.063 "nvme_io_md": false, 00:16:05.063 "write_zeroes": true, 00:16:05.063 "zcopy": true, 00:16:05.063 "get_zone_info": false, 00:16:05.063 "zone_management": false, 00:16:05.063 "zone_append": false, 00:16:05.063 "compare": false, 00:16:05.063 "compare_and_write": false, 00:16:05.063 "abort": true, 00:16:05.063 "seek_hole": false, 00:16:05.063 "seek_data": false, 00:16:05.063 "copy": true, 00:16:05.063 "nvme_iov_md": false 00:16:05.063 }, 00:16:05.063 "memory_domains": [ 00:16:05.063 { 00:16:05.063 "dma_device_id": "system", 00:16:05.063 "dma_device_type": 1 00:16:05.063 }, 00:16:05.063 { 00:16:05.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.063 "dma_device_type": 2 00:16:05.063 } 00:16:05.063 ], 00:16:05.063 "driver_specific": {} 00:16:05.063 }' 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.063 23:37:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.063 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.063 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.320 23:37:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.320 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.578 "name": "BaseBdev2", 00:16:05.578 "aliases": [ 00:16:05.578 "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6" 00:16:05.578 ], 00:16:05.578 "product_name": "Malloc disk", 00:16:05.578 "block_size": 512, 00:16:05.578 "num_blocks": 65536, 00:16:05.578 "uuid": "975d91c1-f9f3-4e86-bb15-f7a8f770d1f6", 00:16:05.578 "assigned_rate_limits": { 00:16:05.578 "rw_ios_per_sec": 0, 00:16:05.578 "rw_mbytes_per_sec": 0, 00:16:05.578 "r_mbytes_per_sec": 0, 00:16:05.578 "w_mbytes_per_sec": 0 00:16:05.578 }, 00:16:05.578 "claimed": true, 00:16:05.578 "claim_type": "exclusive_write", 00:16:05.578 "zoned": false, 00:16:05.578 "supported_io_types": { 00:16:05.578 "read": true, 00:16:05.578 "write": true, 00:16:05.578 "unmap": true, 00:16:05.578 "flush": true, 00:16:05.578 "reset": true, 00:16:05.578 "nvme_admin": false, 00:16:05.578 "nvme_io": false, 00:16:05.578 "nvme_io_md": false, 00:16:05.578 "write_zeroes": true, 00:16:05.578 "zcopy": true, 00:16:05.578 
"get_zone_info": false, 00:16:05.578 "zone_management": false, 00:16:05.578 "zone_append": false, 00:16:05.578 "compare": false, 00:16:05.578 "compare_and_write": false, 00:16:05.578 "abort": true, 00:16:05.578 "seek_hole": false, 00:16:05.578 "seek_data": false, 00:16:05.578 "copy": true, 00:16:05.578 "nvme_iov_md": false 00:16:05.578 }, 00:16:05.578 "memory_domains": [ 00:16:05.578 { 00:16:05.578 "dma_device_id": "system", 00:16:05.578 "dma_device_type": 1 00:16:05.578 }, 00:16:05.578 { 00:16:05.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.578 "dma_device_type": 2 00:16:05.578 } 00:16:05.578 ], 00:16:05.578 "driver_specific": {} 00:16:05.578 }' 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:05.578 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.836 "name": "BaseBdev3", 00:16:05.836 "aliases": [ 00:16:05.836 "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6" 00:16:05.836 ], 00:16:05.836 "product_name": "Malloc disk", 00:16:05.836 "block_size": 512, 00:16:05.836 "num_blocks": 65536, 00:16:05.836 "uuid": "0f4ad446-83f5-42f4-a58c-8efd3b9a34f6", 00:16:05.836 "assigned_rate_limits": { 00:16:05.836 "rw_ios_per_sec": 0, 00:16:05.836 "rw_mbytes_per_sec": 0, 00:16:05.836 "r_mbytes_per_sec": 0, 00:16:05.836 "w_mbytes_per_sec": 0 00:16:05.836 }, 00:16:05.836 "claimed": true, 00:16:05.836 "claim_type": "exclusive_write", 00:16:05.836 "zoned": false, 00:16:05.836 "supported_io_types": { 00:16:05.836 "read": true, 00:16:05.836 "write": true, 00:16:05.836 "unmap": true, 00:16:05.836 "flush": true, 00:16:05.836 "reset": true, 00:16:05.836 "nvme_admin": false, 00:16:05.836 "nvme_io": false, 00:16:05.836 "nvme_io_md": false, 00:16:05.836 "write_zeroes": true, 00:16:05.836 "zcopy": true, 00:16:05.836 "get_zone_info": false, 00:16:05.836 "zone_management": false, 00:16:05.836 "zone_append": false, 00:16:05.836 "compare": false, 00:16:05.836 "compare_and_write": false, 00:16:05.836 "abort": true, 00:16:05.836 "seek_hole": false, 00:16:05.836 "seek_data": false, 00:16:05.836 "copy": true, 00:16:05.836 "nvme_iov_md": false 00:16:05.836 }, 00:16:05.836 "memory_domains": [ 00:16:05.836 { 00:16:05.836 "dma_device_id": "system", 00:16:05.836 "dma_device_type": 1 00:16:05.836 }, 00:16:05.836 { 00:16:05.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.836 
"dma_device_type": 2 00:16:05.836 } 00:16:05.836 ], 00:16:05.836 "driver_specific": {} 00:16:05.836 }' 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.836 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.094 23:37:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.094 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.094 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.094 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.094 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.094 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:06.352 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.352 "name": "BaseBdev4", 00:16:06.352 "aliases": [ 00:16:06.352 
"195228c2-fe72-419d-966a-942ad945756e" 00:16:06.352 ], 00:16:06.352 "product_name": "Malloc disk", 00:16:06.352 "block_size": 512, 00:16:06.352 "num_blocks": 65536, 00:16:06.352 "uuid": "195228c2-fe72-419d-966a-942ad945756e", 00:16:06.352 "assigned_rate_limits": { 00:16:06.352 "rw_ios_per_sec": 0, 00:16:06.352 "rw_mbytes_per_sec": 0, 00:16:06.352 "r_mbytes_per_sec": 0, 00:16:06.352 "w_mbytes_per_sec": 0 00:16:06.352 }, 00:16:06.352 "claimed": true, 00:16:06.352 "claim_type": "exclusive_write", 00:16:06.352 "zoned": false, 00:16:06.352 "supported_io_types": { 00:16:06.352 "read": true, 00:16:06.352 "write": true, 00:16:06.352 "unmap": true, 00:16:06.352 "flush": true, 00:16:06.352 "reset": true, 00:16:06.352 "nvme_admin": false, 00:16:06.352 "nvme_io": false, 00:16:06.352 "nvme_io_md": false, 00:16:06.352 "write_zeroes": true, 00:16:06.352 "zcopy": true, 00:16:06.352 "get_zone_info": false, 00:16:06.352 "zone_management": false, 00:16:06.352 "zone_append": false, 00:16:06.352 "compare": false, 00:16:06.352 "compare_and_write": false, 00:16:06.352 "abort": true, 00:16:06.352 "seek_hole": false, 00:16:06.352 "seek_data": false, 00:16:06.352 "copy": true, 00:16:06.352 "nvme_iov_md": false 00:16:06.352 }, 00:16:06.352 "memory_domains": [ 00:16:06.352 { 00:16:06.352 "dma_device_id": "system", 00:16:06.352 "dma_device_type": 1 00:16:06.352 }, 00:16:06.352 { 00:16:06.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.352 "dma_device_type": 2 00:16:06.352 } 00:16:06.352 ], 00:16:06.352 "driver_specific": {} 00:16:06.352 }' 00:16:06.352 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.352 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.352 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.352 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.610 23:37:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.610 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:06.868 [2024-07-24 23:37:51.675712] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:06.868 [2024-07-24 23:37:51.675732] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:06.868 [2024-07-24 23:37:51.675766] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:06.868 [2024-07-24 23:37:51.675807] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:06.868 [2024-07-24 23:37:51.675813] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe178d0 name Existed_Raid, state offline 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 321249 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 321249 ']' 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@954 -- # kill -0 321249 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 321249 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 321249' 00:16:06.868 killing process with pid 321249 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 321249 00:16:06.868 [2024-07-24 23:37:51.732970] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:06.868 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 321249 00:16:06.868 [2024-07-24 23:37:51.763978] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:07.126 23:37:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:07.126 00:16:07.126 real 0m24.144s 00:16:07.126 user 0m44.917s 00:16:07.126 sys 0m3.757s 00:16:07.126 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:07.126 23:37:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.126 ************************************ 00:16:07.126 END TEST raid_state_function_test_sb 00:16:07.126 ************************************ 00:16:07.126 23:37:51 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:16:07.126 23:37:51 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:16:07.126 23:37:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:07.126 23:37:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:07.126 ************************************ 00:16:07.126 START TEST raid_superblock_test 00:16:07.126 ************************************ 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 4 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:07.126 23:37:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=325862 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 325862 /var/tmp/spdk-raid.sock 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 325862 ']' 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:07.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:07.126 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.126 [2024-07-24 23:37:52.039403] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:16:07.126 [2024-07-24 23:37:52.039440] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid325862 ] 00:16:07.126 [2024-07-24 23:37:52.103009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:07.384 [2024-07-24 23:37:52.182775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:07.384 [2024-07-24 23:37:52.235986] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:07.384 [2024-07-24 23:37:52.236013] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:07.949 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:07.949 23:37:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:07.950 23:37:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:08.207 malloc1 00:16:08.207 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:08.207 [2024-07-24 23:37:53.183188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:08.207 [2024-07-24 23:37:53.183221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:08.207 [2024-07-24 23:37:53.183234] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe34dd0 00:16:08.207 [2024-07-24 23:37:53.183240] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:08.208 [2024-07-24 23:37:53.184414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:08.208 [2024-07-24 23:37:53.184434] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:08.208 pt1 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:08.208 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:08.208 23:37:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:08.465 malloc2 00:16:08.465 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:08.723 [2024-07-24 23:37:53.523713] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:08.723 [2024-07-24 23:37:53.523746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:08.723 [2024-07-24 23:37:53.523758] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe358d0 00:16:08.723 [2024-07-24 23:37:53.523764] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:08.723 [2024-07-24 23:37:53.524795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:08.723 [2024-07-24 23:37:53.524814] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:08.723 pt2 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:08.723 23:37:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:08.723 malloc3 00:16:08.723 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:08.981 [2024-07-24 23:37:53.856146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:08.981 [2024-07-24 23:37:53.856178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:08.981 [2024-07-24 23:37:53.856188] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef6740 00:16:08.981 [2024-07-24 23:37:53.856193] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:08.981 [2024-07-24 23:37:53.857203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:08.981 [2024-07-24 23:37:53.857224] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:08.981 pt3 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:08.981 
23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:08.981 23:37:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:09.239 malloc4 00:16:09.239 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:09.239 [2024-07-24 23:37:54.196647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:09.239 [2024-07-24 23:37:54.196682] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:09.239 [2024-07-24 23:37:54.196693] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2c990 00:16:09.239 [2024-07-24 23:37:54.196715] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:09.239 [2024-07-24 23:37:54.197780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:09.239 [2024-07-24 23:37:54.197800] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:09.239 pt4 00:16:09.239 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:09.239 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:09.239 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:09.496 [2024-07-24 23:37:54.361087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:16:09.496 [2024-07-24 23:37:54.361958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:09.496 [2024-07-24 23:37:54.361998] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:09.496 [2024-07-24 23:37:54.362027] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:09.496 [2024-07-24 23:37:54.362141] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe2e9d0 00:16:09.496 [2024-07-24 23:37:54.362147] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:09.496 [2024-07-24 23:37:54.362281] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe377d0 00:16:09.496 [2024-07-24 23:37:54.362382] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe2e9d0 00:16:09.496 [2024-07-24 23:37:54.362387] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe2e9d0 00:16:09.496 [2024-07-24 23:37:54.362454] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:09.496 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.754 23:37:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.754 "name": "raid_bdev1", 00:16:09.754 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:09.754 "strip_size_kb": 64, 00:16:09.754 "state": "online", 00:16:09.754 "raid_level": "concat", 00:16:09.754 "superblock": true, 00:16:09.754 "num_base_bdevs": 4, 00:16:09.754 "num_base_bdevs_discovered": 4, 00:16:09.754 "num_base_bdevs_operational": 4, 00:16:09.754 "base_bdevs_list": [ 00:16:09.754 { 00:16:09.754 "name": "pt1", 00:16:09.754 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:09.754 "is_configured": true, 00:16:09.754 "data_offset": 2048, 00:16:09.754 "data_size": 63488 00:16:09.754 }, 00:16:09.754 { 00:16:09.754 "name": "pt2", 00:16:09.754 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:09.754 "is_configured": true, 00:16:09.754 "data_offset": 2048, 00:16:09.754 "data_size": 63488 00:16:09.754 }, 00:16:09.754 { 00:16:09.754 "name": "pt3", 00:16:09.754 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:09.754 "is_configured": true, 00:16:09.754 "data_offset": 2048, 00:16:09.754 "data_size": 63488 00:16:09.754 }, 00:16:09.754 { 00:16:09.754 "name": "pt4", 00:16:09.754 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:09.754 "is_configured": true, 00:16:09.754 "data_offset": 2048, 00:16:09.754 "data_size": 63488 00:16:09.754 } 00:16:09.754 ] 00:16:09.754 }' 00:16:09.754 23:37:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.754 23:37:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.319 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:10.319 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:10.319 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:10.320 [2024-07-24 23:37:55.191411] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:10.320 "name": "raid_bdev1", 00:16:10.320 "aliases": [ 00:16:10.320 "d371e481-288d-491e-86fe-5989310ac5ea" 00:16:10.320 ], 00:16:10.320 "product_name": "Raid Volume", 00:16:10.320 "block_size": 512, 00:16:10.320 "num_blocks": 253952, 00:16:10.320 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:10.320 "assigned_rate_limits": { 00:16:10.320 "rw_ios_per_sec": 0, 00:16:10.320 "rw_mbytes_per_sec": 0, 00:16:10.320 "r_mbytes_per_sec": 0, 00:16:10.320 "w_mbytes_per_sec": 0 00:16:10.320 }, 00:16:10.320 "claimed": false, 00:16:10.320 "zoned": false, 00:16:10.320 "supported_io_types": { 00:16:10.320 "read": true, 00:16:10.320 "write": true, 00:16:10.320 
"unmap": true, 00:16:10.320 "flush": true, 00:16:10.320 "reset": true, 00:16:10.320 "nvme_admin": false, 00:16:10.320 "nvme_io": false, 00:16:10.320 "nvme_io_md": false, 00:16:10.320 "write_zeroes": true, 00:16:10.320 "zcopy": false, 00:16:10.320 "get_zone_info": false, 00:16:10.320 "zone_management": false, 00:16:10.320 "zone_append": false, 00:16:10.320 "compare": false, 00:16:10.320 "compare_and_write": false, 00:16:10.320 "abort": false, 00:16:10.320 "seek_hole": false, 00:16:10.320 "seek_data": false, 00:16:10.320 "copy": false, 00:16:10.320 "nvme_iov_md": false 00:16:10.320 }, 00:16:10.320 "memory_domains": [ 00:16:10.320 { 00:16:10.320 "dma_device_id": "system", 00:16:10.320 "dma_device_type": 1 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.320 "dma_device_type": 2 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "system", 00:16:10.320 "dma_device_type": 1 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.320 "dma_device_type": 2 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "system", 00:16:10.320 "dma_device_type": 1 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.320 "dma_device_type": 2 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "system", 00:16:10.320 "dma_device_type": 1 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.320 "dma_device_type": 2 00:16:10.320 } 00:16:10.320 ], 00:16:10.320 "driver_specific": { 00:16:10.320 "raid": { 00:16:10.320 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:10.320 "strip_size_kb": 64, 00:16:10.320 "state": "online", 00:16:10.320 "raid_level": "concat", 00:16:10.320 "superblock": true, 00:16:10.320 "num_base_bdevs": 4, 00:16:10.320 "num_base_bdevs_discovered": 4, 00:16:10.320 "num_base_bdevs_operational": 4, 00:16:10.320 "base_bdevs_list": [ 00:16:10.320 { 00:16:10.320 "name": "pt1", 
00:16:10.320 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:10.320 "is_configured": true, 00:16:10.320 "data_offset": 2048, 00:16:10.320 "data_size": 63488 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "name": "pt2", 00:16:10.320 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:10.320 "is_configured": true, 00:16:10.320 "data_offset": 2048, 00:16:10.320 "data_size": 63488 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "name": "pt3", 00:16:10.320 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:10.320 "is_configured": true, 00:16:10.320 "data_offset": 2048, 00:16:10.320 "data_size": 63488 00:16:10.320 }, 00:16:10.320 { 00:16:10.320 "name": "pt4", 00:16:10.320 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:10.320 "is_configured": true, 00:16:10.320 "data_offset": 2048, 00:16:10.320 "data_size": 63488 00:16:10.320 } 00:16:10.320 ] 00:16:10.320 } 00:16:10.320 } 00:16:10.320 }' 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:10.320 pt2 00:16:10.320 pt3 00:16:10.320 pt4' 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:10.320 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.578 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.579 "name": "pt1", 00:16:10.579 "aliases": [ 00:16:10.579 "00000000-0000-0000-0000-000000000001" 00:16:10.579 ], 00:16:10.579 "product_name": "passthru", 00:16:10.579 "block_size": 512, 00:16:10.579 "num_blocks": 65536, 00:16:10.579 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:16:10.579 "assigned_rate_limits": { 00:16:10.579 "rw_ios_per_sec": 0, 00:16:10.579 "rw_mbytes_per_sec": 0, 00:16:10.579 "r_mbytes_per_sec": 0, 00:16:10.579 "w_mbytes_per_sec": 0 00:16:10.579 }, 00:16:10.579 "claimed": true, 00:16:10.579 "claim_type": "exclusive_write", 00:16:10.579 "zoned": false, 00:16:10.579 "supported_io_types": { 00:16:10.579 "read": true, 00:16:10.579 "write": true, 00:16:10.579 "unmap": true, 00:16:10.579 "flush": true, 00:16:10.579 "reset": true, 00:16:10.579 "nvme_admin": false, 00:16:10.579 "nvme_io": false, 00:16:10.579 "nvme_io_md": false, 00:16:10.579 "write_zeroes": true, 00:16:10.579 "zcopy": true, 00:16:10.579 "get_zone_info": false, 00:16:10.579 "zone_management": false, 00:16:10.579 "zone_append": false, 00:16:10.579 "compare": false, 00:16:10.579 "compare_and_write": false, 00:16:10.579 "abort": true, 00:16:10.579 "seek_hole": false, 00:16:10.579 "seek_data": false, 00:16:10.579 "copy": true, 00:16:10.579 "nvme_iov_md": false 00:16:10.579 }, 00:16:10.579 "memory_domains": [ 00:16:10.579 { 00:16:10.579 "dma_device_id": "system", 00:16:10.579 "dma_device_type": 1 00:16:10.579 }, 00:16:10.579 { 00:16:10.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.579 "dma_device_type": 2 00:16:10.579 } 00:16:10.579 ], 00:16:10.579 "driver_specific": { 00:16:10.579 "passthru": { 00:16:10.579 "name": "pt1", 00:16:10.579 "base_bdev_name": "malloc1" 00:16:10.579 } 00:16:10.579 } 00:16:10.579 }' 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:10.579 23:37:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:10.579 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:10.837 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.095 "name": "pt2", 00:16:11.095 "aliases": [ 00:16:11.095 "00000000-0000-0000-0000-000000000002" 00:16:11.095 ], 00:16:11.095 "product_name": "passthru", 00:16:11.095 "block_size": 512, 00:16:11.095 "num_blocks": 65536, 00:16:11.095 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:11.095 "assigned_rate_limits": { 00:16:11.095 "rw_ios_per_sec": 0, 00:16:11.095 "rw_mbytes_per_sec": 0, 00:16:11.095 "r_mbytes_per_sec": 0, 00:16:11.095 "w_mbytes_per_sec": 0 00:16:11.095 }, 00:16:11.095 "claimed": true, 00:16:11.095 "claim_type": "exclusive_write", 00:16:11.095 "zoned": false, 00:16:11.095 "supported_io_types": { 00:16:11.095 "read": true, 00:16:11.095 "write": true, 00:16:11.095 "unmap": true, 00:16:11.095 "flush": true, 00:16:11.095 "reset": true, 00:16:11.095 "nvme_admin": false, 00:16:11.095 
"nvme_io": false, 00:16:11.095 "nvme_io_md": false, 00:16:11.095 "write_zeroes": true, 00:16:11.095 "zcopy": true, 00:16:11.095 "get_zone_info": false, 00:16:11.095 "zone_management": false, 00:16:11.095 "zone_append": false, 00:16:11.095 "compare": false, 00:16:11.095 "compare_and_write": false, 00:16:11.095 "abort": true, 00:16:11.095 "seek_hole": false, 00:16:11.095 "seek_data": false, 00:16:11.095 "copy": true, 00:16:11.095 "nvme_iov_md": false 00:16:11.095 }, 00:16:11.095 "memory_domains": [ 00:16:11.095 { 00:16:11.095 "dma_device_id": "system", 00:16:11.095 "dma_device_type": 1 00:16:11.095 }, 00:16:11.095 { 00:16:11.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.095 "dma_device_type": 2 00:16:11.095 } 00:16:11.095 ], 00:16:11.095 "driver_specific": { 00:16:11.095 "passthru": { 00:16:11.095 "name": "pt2", 00:16:11.095 "base_bdev_name": "malloc2" 00:16:11.095 } 00:16:11.095 } 00:16:11.095 }' 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.095 23:37:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.095 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.095 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.095 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.095 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.095 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.354 "name": "pt3", 00:16:11.354 "aliases": [ 00:16:11.354 "00000000-0000-0000-0000-000000000003" 00:16:11.354 ], 00:16:11.354 "product_name": "passthru", 00:16:11.354 "block_size": 512, 00:16:11.354 "num_blocks": 65536, 00:16:11.354 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:11.354 "assigned_rate_limits": { 00:16:11.354 "rw_ios_per_sec": 0, 00:16:11.354 "rw_mbytes_per_sec": 0, 00:16:11.354 "r_mbytes_per_sec": 0, 00:16:11.354 "w_mbytes_per_sec": 0 00:16:11.354 }, 00:16:11.354 "claimed": true, 00:16:11.354 "claim_type": "exclusive_write", 00:16:11.354 "zoned": false, 00:16:11.354 "supported_io_types": { 00:16:11.354 "read": true, 00:16:11.354 "write": true, 00:16:11.354 "unmap": true, 00:16:11.354 "flush": true, 00:16:11.354 "reset": true, 00:16:11.354 "nvme_admin": false, 00:16:11.354 "nvme_io": false, 00:16:11.354 "nvme_io_md": false, 00:16:11.354 "write_zeroes": true, 00:16:11.354 "zcopy": true, 00:16:11.354 "get_zone_info": false, 00:16:11.354 "zone_management": false, 00:16:11.354 "zone_append": false, 00:16:11.354 "compare": false, 00:16:11.354 "compare_and_write": false, 00:16:11.354 "abort": true, 00:16:11.354 "seek_hole": false, 00:16:11.354 "seek_data": false, 00:16:11.354 "copy": true, 00:16:11.354 "nvme_iov_md": false 00:16:11.354 }, 00:16:11.354 "memory_domains": [ 00:16:11.354 { 00:16:11.354 "dma_device_id": "system", 00:16:11.354 
"dma_device_type": 1 00:16:11.354 }, 00:16:11.354 { 00:16:11.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.354 "dma_device_type": 2 00:16:11.354 } 00:16:11.354 ], 00:16:11.354 "driver_specific": { 00:16:11.354 "passthru": { 00:16:11.354 "name": "pt3", 00:16:11.354 "base_bdev_name": "malloc3" 00:16:11.354 } 00:16:11.354 } 00:16:11.354 }' 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.354 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.612 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.870 "name": "pt4", 00:16:11.870 "aliases": [ 00:16:11.870 "00000000-0000-0000-0000-000000000004" 00:16:11.870 ], 00:16:11.870 "product_name": "passthru", 00:16:11.870 "block_size": 512, 00:16:11.870 "num_blocks": 65536, 00:16:11.870 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:11.870 "assigned_rate_limits": { 00:16:11.870 "rw_ios_per_sec": 0, 00:16:11.870 "rw_mbytes_per_sec": 0, 00:16:11.870 "r_mbytes_per_sec": 0, 00:16:11.870 "w_mbytes_per_sec": 0 00:16:11.870 }, 00:16:11.870 "claimed": true, 00:16:11.870 "claim_type": "exclusive_write", 00:16:11.870 "zoned": false, 00:16:11.870 "supported_io_types": { 00:16:11.870 "read": true, 00:16:11.870 "write": true, 00:16:11.870 "unmap": true, 00:16:11.870 "flush": true, 00:16:11.870 "reset": true, 00:16:11.870 "nvme_admin": false, 00:16:11.870 "nvme_io": false, 00:16:11.870 "nvme_io_md": false, 00:16:11.870 "write_zeroes": true, 00:16:11.870 "zcopy": true, 00:16:11.870 "get_zone_info": false, 00:16:11.870 "zone_management": false, 00:16:11.870 "zone_append": false, 00:16:11.870 "compare": false, 00:16:11.870 "compare_and_write": false, 00:16:11.870 "abort": true, 00:16:11.870 "seek_hole": false, 00:16:11.870 "seek_data": false, 00:16:11.870 "copy": true, 00:16:11.870 "nvme_iov_md": false 00:16:11.870 }, 00:16:11.870 "memory_domains": [ 00:16:11.870 { 00:16:11.870 "dma_device_id": "system", 00:16:11.870 "dma_device_type": 1 00:16:11.870 }, 00:16:11.870 { 00:16:11.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.870 "dma_device_type": 2 00:16:11.870 } 00:16:11.870 ], 00:16:11.870 "driver_specific": { 00:16:11.870 "passthru": { 00:16:11.870 "name": "pt4", 00:16:11.870 "base_bdev_name": "malloc4" 00:16:11.870 } 00:16:11.870 } 00:16:11.870 }' 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.870 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.128 23:37:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.128 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.128 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.128 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.128 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.128 23:37:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:12.128 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:12.386 [2024-07-24 23:37:57.236725] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:12.386 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d371e481-288d-491e-86fe-5989310ac5ea 00:16:12.386 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d371e481-288d-491e-86fe-5989310ac5ea ']' 00:16:12.386 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:12.644 [2024-07-24 23:37:57.392920] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:12.644 
[2024-07-24 23:37:57.392931] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.644 [2024-07-24 23:37:57.392965] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.644 [2024-07-24 23:37:57.393009] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:12.644 [2024-07-24 23:37:57.393014] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2e9d0 name raid_bdev1, state offline 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:12.644 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:12.901 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:12.902 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:12.902 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:12.902 23:37:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:13.160 23:37:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:13.160 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:13.418 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:13.677 [2024-07-24 23:37:58.531839] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:13.677 [2024-07-24 23:37:58.532824] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:13.677 [2024-07-24 23:37:58.532854] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:13.677 [2024-07-24 23:37:58.532876] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:13.677 [2024-07-24 23:37:58.532912] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:13.677 [2024-07-24 23:37:58.532939] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:13.677 [2024-07-24 23:37:58.532951] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:13.677 [2024-07-24 23:37:58.532963] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:13.677 
[2024-07-24 23:37:58.532971] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:13.677 [2024-07-24 23:37:58.532977] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe35250 name raid_bdev1, state configuring 00:16:13.677 request: 00:16:13.677 { 00:16:13.677 "name": "raid_bdev1", 00:16:13.677 "raid_level": "concat", 00:16:13.677 "base_bdevs": [ 00:16:13.677 "malloc1", 00:16:13.677 "malloc2", 00:16:13.677 "malloc3", 00:16:13.677 "malloc4" 00:16:13.677 ], 00:16:13.677 "strip_size_kb": 64, 00:16:13.677 "superblock": false, 00:16:13.677 "method": "bdev_raid_create", 00:16:13.677 "req_id": 1 00:16:13.677 } 00:16:13.677 Got JSON-RPC error response 00:16:13.677 response: 00:16:13.677 { 00:16:13.677 "code": -17, 00:16:13.677 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:13.677 } 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.677 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:16:13.935 [2024-07-24 23:37:58.868676] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:13.935 [2024-07-24 23:37:58.868706] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:13.935 [2024-07-24 23:37:58.868719] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef6970 00:16:13.935 [2024-07-24 23:37:58.868725] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:13.935 [2024-07-24 23:37:58.869864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:13.935 [2024-07-24 23:37:58.869884] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:13.935 [2024-07-24 23:37:58.869929] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:13.935 [2024-07-24 23:37:58.869948] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:13.935 pt1 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.935 23:37:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:14.194 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.194 "name": "raid_bdev1", 00:16:14.194 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:14.194 "strip_size_kb": 64, 00:16:14.194 "state": "configuring", 00:16:14.194 "raid_level": "concat", 00:16:14.194 "superblock": true, 00:16:14.194 "num_base_bdevs": 4, 00:16:14.194 "num_base_bdevs_discovered": 1, 00:16:14.194 "num_base_bdevs_operational": 4, 00:16:14.194 "base_bdevs_list": [ 00:16:14.194 { 00:16:14.194 "name": "pt1", 00:16:14.194 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:14.194 "is_configured": true, 00:16:14.194 "data_offset": 2048, 00:16:14.194 "data_size": 63488 00:16:14.194 }, 00:16:14.194 { 00:16:14.194 "name": null, 00:16:14.194 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:14.194 "is_configured": false, 00:16:14.194 "data_offset": 2048, 00:16:14.194 "data_size": 63488 00:16:14.194 }, 00:16:14.194 { 00:16:14.194 "name": null, 00:16:14.194 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:14.194 "is_configured": false, 00:16:14.194 "data_offset": 2048, 00:16:14.194 "data_size": 63488 00:16:14.194 }, 00:16:14.194 { 00:16:14.194 "name": null, 00:16:14.194 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:14.194 "is_configured": false, 00:16:14.194 "data_offset": 2048, 00:16:14.194 "data_size": 63488 00:16:14.194 } 00:16:14.194 ] 00:16:14.194 }' 00:16:14.194 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.194 23:37:59 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.760 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:16:14.760 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:14.760 [2024-07-24 23:37:59.650712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:14.760 [2024-07-24 23:37:59.650752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:14.760 [2024-07-24 23:37:59.650763] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2f6d0 00:16:14.760 [2024-07-24 23:37:59.650785] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:14.760 [2024-07-24 23:37:59.651031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:14.760 [2024-07-24 23:37:59.651042] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:14.760 [2024-07-24 23:37:59.651088] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:14.760 [2024-07-24 23:37:59.651101] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:14.760 pt2 00:16:14.760 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:15.019 [2024-07-24 23:37:59.815148] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:15.019 23:37:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:15.019 23:37:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.019 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:15.019 "name": "raid_bdev1", 00:16:15.019 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:15.019 "strip_size_kb": 64, 00:16:15.019 "state": "configuring", 00:16:15.019 "raid_level": "concat", 00:16:15.019 "superblock": true, 00:16:15.019 "num_base_bdevs": 4, 00:16:15.019 "num_base_bdevs_discovered": 1, 00:16:15.019 "num_base_bdevs_operational": 4, 00:16:15.019 "base_bdevs_list": [ 00:16:15.019 { 00:16:15.019 "name": "pt1", 00:16:15.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:15.019 "is_configured": true, 00:16:15.019 "data_offset": 2048, 00:16:15.019 "data_size": 63488 00:16:15.019 }, 00:16:15.019 { 00:16:15.019 "name": null, 00:16:15.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:15.019 
"is_configured": false, 00:16:15.019 "data_offset": 2048, 00:16:15.019 "data_size": 63488 00:16:15.019 }, 00:16:15.019 { 00:16:15.019 "name": null, 00:16:15.019 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:15.019 "is_configured": false, 00:16:15.019 "data_offset": 2048, 00:16:15.019 "data_size": 63488 00:16:15.019 }, 00:16:15.019 { 00:16:15.019 "name": null, 00:16:15.019 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:15.019 "is_configured": false, 00:16:15.019 "data_offset": 2048, 00:16:15.019 "data_size": 63488 00:16:15.019 } 00:16:15.019 ] 00:16:15.019 }' 00:16:15.019 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:15.019 23:38:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.587 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:15.587 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:15.587 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:15.845 [2024-07-24 23:38:00.625221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:15.845 [2024-07-24 23:38:00.625259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.845 [2024-07-24 23:38:00.625269] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2e060 00:16:15.845 [2024-07-24 23:38:00.625275] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.845 [2024-07-24 23:38:00.625537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.845 [2024-07-24 23:38:00.625548] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:15.845 [2024-07-24 23:38:00.625590] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:15.845 [2024-07-24 23:38:00.625603] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:15.845 pt2 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:15.845 [2024-07-24 23:38:00.789648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:15.845 [2024-07-24 23:38:00.789673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.845 [2024-07-24 23:38:00.789682] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2d610 00:16:15.845 [2024-07-24 23:38:00.789687] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.845 [2024-07-24 23:38:00.789871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.845 [2024-07-24 23:38:00.789879] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:15.845 [2024-07-24 23:38:00.789910] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:15.845 [2024-07-24 23:38:00.789920] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:15.845 pt3 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:15.845 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:16.104 [2024-07-24 23:38:00.958096] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:16.104 [2024-07-24 23:38:00.958121] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.104 [2024-07-24 23:38:00.958131] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe30a50 00:16:16.104 [2024-07-24 23:38:00.958138] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.104 [2024-07-24 23:38:00.958369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.104 [2024-07-24 23:38:00.958381] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:16.104 [2024-07-24 23:38:00.958418] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:16:16.104 [2024-07-24 23:38:00.958430] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:16.104 [2024-07-24 23:38:00.958541] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xef8370 00:16:16.104 [2024-07-24 23:38:00.958548] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:16.104 [2024-07-24 23:38:00.958680] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe2c740 00:16:16.104 [2024-07-24 23:38:00.958783] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xef8370 00:16:16.104 [2024-07-24 23:38:00.958790] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xef8370 00:16:16.104 [2024-07-24 23:38:00.958864] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:16.104 pt4 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.104 23:38:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:16.363 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.363 "name": "raid_bdev1", 00:16:16.363 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:16.363 "strip_size_kb": 64, 00:16:16.363 "state": "online", 00:16:16.363 "raid_level": "concat", 00:16:16.363 "superblock": true, 00:16:16.363 "num_base_bdevs": 4, 00:16:16.363 "num_base_bdevs_discovered": 4, 00:16:16.363 "num_base_bdevs_operational": 4, 00:16:16.363 "base_bdevs_list": [ 00:16:16.363 { 
00:16:16.363 "name": "pt1", 00:16:16.363 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:16.363 "is_configured": true, 00:16:16.363 "data_offset": 2048, 00:16:16.363 "data_size": 63488 00:16:16.363 }, 00:16:16.363 { 00:16:16.363 "name": "pt2", 00:16:16.363 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:16.363 "is_configured": true, 00:16:16.363 "data_offset": 2048, 00:16:16.363 "data_size": 63488 00:16:16.363 }, 00:16:16.363 { 00:16:16.363 "name": "pt3", 00:16:16.363 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:16.363 "is_configured": true, 00:16:16.363 "data_offset": 2048, 00:16:16.363 "data_size": 63488 00:16:16.363 }, 00:16:16.363 { 00:16:16.363 "name": "pt4", 00:16:16.363 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:16.363 "is_configured": true, 00:16:16.363 "data_offset": 2048, 00:16:16.363 "data_size": 63488 00:16:16.363 } 00:16:16.363 ] 00:16:16.363 }' 00:16:16.363 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.363 23:38:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:16.622 23:38:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:16.880 [2024-07-24 23:38:01.756353] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:16.880 "name": "raid_bdev1", 00:16:16.880 "aliases": [ 00:16:16.880 "d371e481-288d-491e-86fe-5989310ac5ea" 00:16:16.880 ], 00:16:16.880 "product_name": "Raid Volume", 00:16:16.880 "block_size": 512, 00:16:16.880 "num_blocks": 253952, 00:16:16.880 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:16.880 "assigned_rate_limits": { 00:16:16.880 "rw_ios_per_sec": 0, 00:16:16.880 "rw_mbytes_per_sec": 0, 00:16:16.880 "r_mbytes_per_sec": 0, 00:16:16.880 "w_mbytes_per_sec": 0 00:16:16.880 }, 00:16:16.880 "claimed": false, 00:16:16.880 "zoned": false, 00:16:16.880 "supported_io_types": { 00:16:16.880 "read": true, 00:16:16.880 "write": true, 00:16:16.880 "unmap": true, 00:16:16.880 "flush": true, 00:16:16.880 "reset": true, 00:16:16.880 "nvme_admin": false, 00:16:16.880 "nvme_io": false, 00:16:16.880 "nvme_io_md": false, 00:16:16.880 "write_zeroes": true, 00:16:16.880 "zcopy": false, 00:16:16.880 "get_zone_info": false, 00:16:16.880 "zone_management": false, 00:16:16.880 "zone_append": false, 00:16:16.880 "compare": false, 00:16:16.880 "compare_and_write": false, 00:16:16.880 "abort": false, 00:16:16.880 "seek_hole": false, 00:16:16.880 "seek_data": false, 00:16:16.880 "copy": false, 00:16:16.880 "nvme_iov_md": false 00:16:16.880 }, 00:16:16.880 "memory_domains": [ 00:16:16.880 { 00:16:16.880 "dma_device_id": "system", 00:16:16.880 "dma_device_type": 1 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.880 "dma_device_type": 2 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "system", 00:16:16.880 "dma_device_type": 1 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.880 "dma_device_type": 2 00:16:16.880 }, 
00:16:16.880 { 00:16:16.880 "dma_device_id": "system", 00:16:16.880 "dma_device_type": 1 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.880 "dma_device_type": 2 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "system", 00:16:16.880 "dma_device_type": 1 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.880 "dma_device_type": 2 00:16:16.880 } 00:16:16.880 ], 00:16:16.880 "driver_specific": { 00:16:16.880 "raid": { 00:16:16.880 "uuid": "d371e481-288d-491e-86fe-5989310ac5ea", 00:16:16.880 "strip_size_kb": 64, 00:16:16.880 "state": "online", 00:16:16.880 "raid_level": "concat", 00:16:16.880 "superblock": true, 00:16:16.880 "num_base_bdevs": 4, 00:16:16.880 "num_base_bdevs_discovered": 4, 00:16:16.880 "num_base_bdevs_operational": 4, 00:16:16.880 "base_bdevs_list": [ 00:16:16.880 { 00:16:16.880 "name": "pt1", 00:16:16.880 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:16.880 "is_configured": true, 00:16:16.880 "data_offset": 2048, 00:16:16.880 "data_size": 63488 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "name": "pt2", 00:16:16.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:16.880 "is_configured": true, 00:16:16.880 "data_offset": 2048, 00:16:16.880 "data_size": 63488 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "name": "pt3", 00:16:16.880 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:16.880 "is_configured": true, 00:16:16.880 "data_offset": 2048, 00:16:16.880 "data_size": 63488 00:16:16.880 }, 00:16:16.880 { 00:16:16.880 "name": "pt4", 00:16:16.880 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:16.880 "is_configured": true, 00:16:16.880 "data_offset": 2048, 00:16:16.880 "data_size": 63488 00:16:16.880 } 00:16:16.880 ] 00:16:16.880 } 00:16:16.880 } 00:16:16.880 }' 00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:16.880 pt2 00:16:16.880 pt3 00:16:16.880 pt4' 00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:16.880 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.139 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.139 "name": "pt1", 00:16:17.139 "aliases": [ 00:16:17.139 "00000000-0000-0000-0000-000000000001" 00:16:17.139 ], 00:16:17.139 "product_name": "passthru", 00:16:17.139 "block_size": 512, 00:16:17.139 "num_blocks": 65536, 00:16:17.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:17.139 "assigned_rate_limits": { 00:16:17.139 "rw_ios_per_sec": 0, 00:16:17.139 "rw_mbytes_per_sec": 0, 00:16:17.139 "r_mbytes_per_sec": 0, 00:16:17.139 "w_mbytes_per_sec": 0 00:16:17.139 }, 00:16:17.139 "claimed": true, 00:16:17.139 "claim_type": "exclusive_write", 00:16:17.139 "zoned": false, 00:16:17.139 "supported_io_types": { 00:16:17.139 "read": true, 00:16:17.139 "write": true, 00:16:17.139 "unmap": true, 00:16:17.139 "flush": true, 00:16:17.139 "reset": true, 00:16:17.139 "nvme_admin": false, 00:16:17.139 "nvme_io": false, 00:16:17.139 "nvme_io_md": false, 00:16:17.139 "write_zeroes": true, 00:16:17.139 "zcopy": true, 00:16:17.139 "get_zone_info": false, 00:16:17.139 "zone_management": false, 00:16:17.139 "zone_append": false, 00:16:17.139 "compare": false, 00:16:17.139 "compare_and_write": false, 00:16:17.139 "abort": true, 00:16:17.139 "seek_hole": false, 00:16:17.139 "seek_data": false, 00:16:17.139 "copy": true, 00:16:17.139 "nvme_iov_md": false 00:16:17.139 }, 00:16:17.139 "memory_domains": [ 00:16:17.139 { 
00:16:17.139 "dma_device_id": "system", 00:16:17.139 "dma_device_type": 1 00:16:17.139 }, 00:16:17.139 { 00:16:17.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.139 "dma_device_type": 2 00:16:17.139 } 00:16:17.139 ], 00:16:17.139 "driver_specific": { 00:16:17.139 "passthru": { 00:16:17.139 "name": "pt1", 00:16:17.139 "base_bdev_name": "malloc1" 00:16:17.139 } 00:16:17.139 } 00:16:17.139 }' 00:16:17.139 23:38:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.139 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.139 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.139 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.139 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:17.400 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.742 
23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.742 "name": "pt2", 00:16:17.742 "aliases": [ 00:16:17.742 "00000000-0000-0000-0000-000000000002" 00:16:17.742 ], 00:16:17.742 "product_name": "passthru", 00:16:17.742 "block_size": 512, 00:16:17.742 "num_blocks": 65536, 00:16:17.742 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:17.742 "assigned_rate_limits": { 00:16:17.742 "rw_ios_per_sec": 0, 00:16:17.742 "rw_mbytes_per_sec": 0, 00:16:17.742 "r_mbytes_per_sec": 0, 00:16:17.742 "w_mbytes_per_sec": 0 00:16:17.742 }, 00:16:17.742 "claimed": true, 00:16:17.742 "claim_type": "exclusive_write", 00:16:17.742 "zoned": false, 00:16:17.742 "supported_io_types": { 00:16:17.742 "read": true, 00:16:17.742 "write": true, 00:16:17.742 "unmap": true, 00:16:17.742 "flush": true, 00:16:17.742 "reset": true, 00:16:17.742 "nvme_admin": false, 00:16:17.742 "nvme_io": false, 00:16:17.742 "nvme_io_md": false, 00:16:17.742 "write_zeroes": true, 00:16:17.742 "zcopy": true, 00:16:17.742 "get_zone_info": false, 00:16:17.742 "zone_management": false, 00:16:17.742 "zone_append": false, 00:16:17.742 "compare": false, 00:16:17.742 "compare_and_write": false, 00:16:17.742 "abort": true, 00:16:17.742 "seek_hole": false, 00:16:17.742 "seek_data": false, 00:16:17.742 "copy": true, 00:16:17.742 "nvme_iov_md": false 00:16:17.742 }, 00:16:17.742 "memory_domains": [ 00:16:17.742 { 00:16:17.742 "dma_device_id": "system", 00:16:17.742 "dma_device_type": 1 00:16:17.742 }, 00:16:17.742 { 00:16:17.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.742 "dma_device_type": 2 00:16:17.742 } 00:16:17.742 ], 00:16:17.742 "driver_specific": { 00:16:17.742 "passthru": { 00:16:17.742 "name": "pt2", 00:16:17.742 "base_bdev_name": "malloc2" 00:16:17.742 } 00:16:17.742 } 00:16:17.742 }' 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.742 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.000 "name": "pt3", 00:16:18.000 "aliases": [ 00:16:18.000 "00000000-0000-0000-0000-000000000003" 00:16:18.000 ], 00:16:18.000 "product_name": "passthru", 00:16:18.000 "block_size": 512, 00:16:18.000 "num_blocks": 65536, 00:16:18.000 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:18.000 "assigned_rate_limits": { 00:16:18.000 "rw_ios_per_sec": 0, 00:16:18.000 "rw_mbytes_per_sec": 0, 00:16:18.000 "r_mbytes_per_sec": 0, 00:16:18.000 "w_mbytes_per_sec": 0 00:16:18.000 }, 
00:16:18.000 "claimed": true, 00:16:18.000 "claim_type": "exclusive_write", 00:16:18.000 "zoned": false, 00:16:18.000 "supported_io_types": { 00:16:18.000 "read": true, 00:16:18.000 "write": true, 00:16:18.000 "unmap": true, 00:16:18.000 "flush": true, 00:16:18.000 "reset": true, 00:16:18.000 "nvme_admin": false, 00:16:18.000 "nvme_io": false, 00:16:18.000 "nvme_io_md": false, 00:16:18.000 "write_zeroes": true, 00:16:18.000 "zcopy": true, 00:16:18.000 "get_zone_info": false, 00:16:18.000 "zone_management": false, 00:16:18.000 "zone_append": false, 00:16:18.000 "compare": false, 00:16:18.000 "compare_and_write": false, 00:16:18.000 "abort": true, 00:16:18.000 "seek_hole": false, 00:16:18.000 "seek_data": false, 00:16:18.000 "copy": true, 00:16:18.000 "nvme_iov_md": false 00:16:18.000 }, 00:16:18.000 "memory_domains": [ 00:16:18.000 { 00:16:18.000 "dma_device_id": "system", 00:16:18.000 "dma_device_type": 1 00:16:18.000 }, 00:16:18.000 { 00:16:18.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.000 "dma_device_type": 2 00:16:18.000 } 00:16:18.000 ], 00:16:18.000 "driver_specific": { 00:16:18.000 "passthru": { 00:16:18.000 "name": "pt3", 00:16:18.000 "base_bdev_name": "malloc3" 00:16:18.000 } 00:16:18.000 } 00:16:18.000 }' 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.000 23:38:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:18.259 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.518 "name": "pt4", 00:16:18.518 "aliases": [ 00:16:18.518 "00000000-0000-0000-0000-000000000004" 00:16:18.518 ], 00:16:18.518 "product_name": "passthru", 00:16:18.518 "block_size": 512, 00:16:18.518 "num_blocks": 65536, 00:16:18.518 "uuid": "00000000-0000-0000-0000-000000000004", 00:16:18.518 "assigned_rate_limits": { 00:16:18.518 "rw_ios_per_sec": 0, 00:16:18.518 "rw_mbytes_per_sec": 0, 00:16:18.518 "r_mbytes_per_sec": 0, 00:16:18.518 "w_mbytes_per_sec": 0 00:16:18.518 }, 00:16:18.518 "claimed": true, 00:16:18.518 "claim_type": "exclusive_write", 00:16:18.518 "zoned": false, 00:16:18.518 "supported_io_types": { 00:16:18.518 "read": true, 00:16:18.518 "write": true, 00:16:18.518 "unmap": true, 00:16:18.518 "flush": true, 00:16:18.518 "reset": true, 00:16:18.518 "nvme_admin": false, 00:16:18.518 "nvme_io": false, 00:16:18.518 "nvme_io_md": false, 00:16:18.518 "write_zeroes": true, 00:16:18.518 "zcopy": true, 00:16:18.518 "get_zone_info": false, 00:16:18.518 "zone_management": false, 00:16:18.518 "zone_append": false, 00:16:18.518 
"compare": false, 00:16:18.518 "compare_and_write": false, 00:16:18.518 "abort": true, 00:16:18.518 "seek_hole": false, 00:16:18.518 "seek_data": false, 00:16:18.518 "copy": true, 00:16:18.518 "nvme_iov_md": false 00:16:18.518 }, 00:16:18.518 "memory_domains": [ 00:16:18.518 { 00:16:18.518 "dma_device_id": "system", 00:16:18.518 "dma_device_type": 1 00:16:18.518 }, 00:16:18.518 { 00:16:18.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.518 "dma_device_type": 2 00:16:18.518 } 00:16:18.518 ], 00:16:18.518 "driver_specific": { 00:16:18.518 "passthru": { 00:16:18.518 "name": "pt4", 00:16:18.518 "base_bdev_name": "malloc4" 00:16:18.518 } 00:16:18.518 } 00:16:18.518 }' 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.518 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.775 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:18.775 23:38:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:19.033 [2024-07-24 23:38:03.861821] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d371e481-288d-491e-86fe-5989310ac5ea '!=' d371e481-288d-491e-86fe-5989310ac5ea ']' 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 325862 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 325862 ']' 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 325862 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 325862 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 325862' 00:16:19.033 killing process with pid 325862 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 325862 00:16:19.033 [2024-07-24 23:38:03.920674] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:19.033 [2024-07-24 23:38:03.920719] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:19.033 [2024-07-24 23:38:03.920771] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:19.033 [2024-07-24 23:38:03.920777] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xef8370 name raid_bdev1, state offline 00:16:19.033 23:38:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 325862 00:16:19.033 [2024-07-24 23:38:03.951899] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:19.293 23:38:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:19.293 00:16:19.293 real 0m12.121s 00:16:19.293 user 0m22.131s 00:16:19.293 sys 0m1.836s 00:16:19.293 23:38:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:19.293 23:38:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.293 ************************************ 00:16:19.293 END TEST raid_superblock_test 00:16:19.293 ************************************ 00:16:19.293 23:38:04 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:16:19.293 23:38:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:19.293 23:38:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:19.293 23:38:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:19.293 ************************************ 00:16:19.293 START TEST raid_read_error_test 00:16:19.293 ************************************ 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 read 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:19.293 23:38:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:19.293 23:38:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.oCSiRwY5vv 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=328241 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 328241 /var/tmp/spdk-raid.sock 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 328241 ']' 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:16:19.293 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:19.293 23:38:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.293 [2024-07-24 23:38:04.260732] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:16:19.293 [2024-07-24 23:38:04.260768] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid328241 ] 00:16:19.553 [2024-07-24 23:38:04.322794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:19.553 [2024-07-24 23:38:04.400569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:19.553 [2024-07-24 23:38:04.451123] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:19.553 [2024-07-24 23:38:04.451146] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:20.120 23:38:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:20.120 23:38:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:16:20.120 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:20.120 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:20.380 BaseBdev1_malloc 00:16:20.380 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:20.380 true 00:16:20.639 23:38:05 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:20.639 [2024-07-24 23:38:05.538476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:20.639 [2024-07-24 23:38:05.538506] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:20.639 [2024-07-24 23:38:05.538518] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b2a550 00:16:20.639 [2024-07-24 23:38:05.538528] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:20.639 [2024-07-24 23:38:05.539829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:20.639 [2024-07-24 23:38:05.539850] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:20.639 BaseBdev1 00:16:20.639 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:20.639 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:20.898 BaseBdev2_malloc 00:16:20.898 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:20.898 true 00:16:20.898 23:38:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:21.157 [2024-07-24 23:38:06.039273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:21.157 [2024-07-24 23:38:06.039303] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.157 [2024-07-24 23:38:06.039314] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b2ed90 00:16:21.157 [2024-07-24 23:38:06.039320] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.157 [2024-07-24 23:38:06.040391] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.157 [2024-07-24 23:38:06.040412] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:21.157 BaseBdev2 00:16:21.157 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:21.157 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:21.416 BaseBdev3_malloc 00:16:21.416 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:21.416 true 00:16:21.416 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:21.674 [2024-07-24 23:38:06.531866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:21.674 [2024-07-24 23:38:06.531896] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.674 [2024-07-24 23:38:06.531907] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b31050 00:16:21.674 [2024-07-24 23:38:06.531913] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.674 [2024-07-24 23:38:06.532941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:16:21.674 [2024-07-24 23:38:06.532962] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:21.674 BaseBdev3 00:16:21.674 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:21.674 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:21.933 BaseBdev4_malloc 00:16:21.933 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:21.933 true 00:16:21.933 23:38:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:22.192 [2024-07-24 23:38:07.032631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:22.192 [2024-07-24 23:38:07.032666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.192 [2024-07-24 23:38:07.032678] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b31f20 00:16:22.192 [2024-07-24 23:38:07.032684] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.192 [2024-07-24 23:38:07.033753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.192 [2024-07-24 23:38:07.033773] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:22.192 BaseBdev4 00:16:22.192 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:16:22.452 [2024-07-24 23:38:07.201098] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:22.452 [2024-07-24 23:38:07.201980] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:22.452 [2024-07-24 23:38:07.202026] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:22.452 [2024-07-24 23:38:07.202064] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:22.452 [2024-07-24 23:38:07.202218] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b2c0a0 00:16:22.452 [2024-07-24 23:38:07.202224] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:22.452 [2024-07-24 23:38:07.202356] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19806e0 00:16:22.452 [2024-07-24 23:38:07.202456] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b2c0a0 00:16:22.452 [2024-07-24 23:38:07.202461] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b2c0a0 00:16:22.452 [2024-07-24 23:38:07.202535] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.452 "name": "raid_bdev1", 00:16:22.452 "uuid": "73d791cd-e7c4-4567-9b3a-d728b037ba8f", 00:16:22.452 "strip_size_kb": 64, 00:16:22.452 "state": "online", 00:16:22.452 "raid_level": "concat", 00:16:22.452 "superblock": true, 00:16:22.452 "num_base_bdevs": 4, 00:16:22.452 "num_base_bdevs_discovered": 4, 00:16:22.452 "num_base_bdevs_operational": 4, 00:16:22.452 "base_bdevs_list": [ 00:16:22.452 { 00:16:22.452 "name": "BaseBdev1", 00:16:22.452 "uuid": "5b6154d3-8282-5de9-a72a-514e94112d2e", 00:16:22.452 "is_configured": true, 00:16:22.452 "data_offset": 2048, 00:16:22.452 "data_size": 63488 00:16:22.452 }, 00:16:22.452 { 00:16:22.452 "name": "BaseBdev2", 00:16:22.452 "uuid": "c7e00b77-85a7-594e-b2b1-ebed0b564319", 00:16:22.452 "is_configured": true, 00:16:22.452 "data_offset": 2048, 00:16:22.452 "data_size": 63488 00:16:22.452 }, 00:16:22.452 { 00:16:22.452 "name": "BaseBdev3", 00:16:22.452 "uuid": "c0bc6fe2-b56a-511d-89ac-7fa6c9d52a4e", 00:16:22.452 "is_configured": true, 00:16:22.452 "data_offset": 2048, 00:16:22.452 "data_size": 63488 00:16:22.452 }, 00:16:22.452 { 00:16:22.452 "name": "BaseBdev4", 00:16:22.452 "uuid": "9642c337-66e9-548e-8c4f-1d294a4e505f", 00:16:22.452 
"is_configured": true, 00:16:22.452 "data_offset": 2048, 00:16:22.452 "data_size": 63488 00:16:22.452 } 00:16:22.452 ] 00:16:22.452 }' 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.452 23:38:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.018 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:23.018 23:38:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:23.018 [2024-07-24 23:38:07.939200] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b1e440 00:16:23.954 23:38:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.213 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.472 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.472 "name": "raid_bdev1", 00:16:24.472 "uuid": "73d791cd-e7c4-4567-9b3a-d728b037ba8f", 00:16:24.472 "strip_size_kb": 64, 00:16:24.472 "state": "online", 00:16:24.472 "raid_level": "concat", 00:16:24.472 "superblock": true, 00:16:24.472 "num_base_bdevs": 4, 00:16:24.472 "num_base_bdevs_discovered": 4, 00:16:24.472 "num_base_bdevs_operational": 4, 00:16:24.472 "base_bdevs_list": [ 00:16:24.472 { 00:16:24.472 "name": "BaseBdev1", 00:16:24.472 "uuid": "5b6154d3-8282-5de9-a72a-514e94112d2e", 00:16:24.472 "is_configured": true, 00:16:24.472 "data_offset": 2048, 00:16:24.472 "data_size": 63488 00:16:24.472 }, 00:16:24.472 { 00:16:24.472 "name": "BaseBdev2", 00:16:24.472 "uuid": "c7e00b77-85a7-594e-b2b1-ebed0b564319", 00:16:24.472 "is_configured": true, 00:16:24.472 "data_offset": 2048, 00:16:24.472 "data_size": 63488 00:16:24.472 }, 00:16:24.472 { 00:16:24.472 "name": "BaseBdev3", 00:16:24.472 "uuid": "c0bc6fe2-b56a-511d-89ac-7fa6c9d52a4e", 00:16:24.472 "is_configured": true, 00:16:24.472 "data_offset": 2048, 00:16:24.472 "data_size": 63488 00:16:24.472 }, 00:16:24.472 { 00:16:24.472 "name": "BaseBdev4", 00:16:24.472 "uuid": 
"9642c337-66e9-548e-8c4f-1d294a4e505f", 00:16:24.472 "is_configured": true, 00:16:24.472 "data_offset": 2048, 00:16:24.472 "data_size": 63488 00:16:24.472 } 00:16:24.472 ] 00:16:24.472 }' 00:16:24.472 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.472 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.731 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:24.990 [2024-07-24 23:38:09.847848] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:24.990 [2024-07-24 23:38:09.847880] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:24.990 [2024-07-24 23:38:09.849862] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:24.990 [2024-07-24 23:38:09.849887] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:24.990 [2024-07-24 23:38:09.849912] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:24.990 [2024-07-24 23:38:09.849917] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b2c0a0 name raid_bdev1, state offline 00:16:24.990 0 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 328241 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 328241 ']' 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 328241 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps 
--no-headers -o comm= 328241 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 328241' 00:16:24.990 killing process with pid 328241 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 328241 00:16:24.990 [2024-07-24 23:38:09.909462] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:24.990 23:38:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 328241 00:16:24.990 [2024-07-24 23:38:09.935043] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.oCSiRwY5vv 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.53 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.53 != \0\.\0\0 ]] 00:16:25.249 00:16:25.249 real 0m5.926s 00:16:25.249 user 0m9.324s 00:16:25.249 sys 0m0.871s 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:25.249 23:38:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.249 
************************************ 00:16:25.249 END TEST raid_read_error_test 00:16:25.249 ************************************ 00:16:25.249 23:38:10 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:16:25.249 23:38:10 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:25.249 23:38:10 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:25.249 23:38:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:25.249 ************************************ 00:16:25.249 START TEST raid_write_error_test 00:16:25.249 ************************************ 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 write 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # 
echo BaseBdev3 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.tjAjMfvFtE 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=329259 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@809 -- # waitforlisten 329259 /var/tmp/spdk-raid.sock 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 329259 ']' 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:25.249 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:25.249 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:25.250 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:25.250 23:38:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.508 [2024-07-24 23:38:10.251659] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:16:25.508 [2024-07-24 23:38:10.251699] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329259 ] 00:16:25.508 [2024-07-24 23:38:10.314601] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:25.508 [2024-07-24 23:38:10.393796] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:25.508 [2024-07-24 23:38:10.450359] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:25.508 [2024-07-24 23:38:10.450388] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.075 23:38:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:26.075 23:38:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:16:26.075 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:26.075 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:26.333 BaseBdev1_malloc 00:16:26.333 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:26.592 true 00:16:26.592 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:26.592 [2024-07-24 23:38:11.522658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:26.592 [2024-07-24 23:38:11.522690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:16:26.592 [2024-07-24 23:38:11.522702] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ba550 00:16:26.592 [2024-07-24 23:38:11.522707] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.592 [2024-07-24 23:38:11.523911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.592 [2024-07-24 23:38:11.523932] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:26.592 BaseBdev1 00:16:26.592 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:26.592 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:26.851 BaseBdev2_malloc 00:16:26.851 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:27.110 true 00:16:27.110 23:38:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:27.110 [2024-07-24 23:38:12.019410] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:27.110 [2024-07-24 23:38:12.019443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.110 [2024-07-24 23:38:12.019455] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13bed90 00:16:27.110 [2024-07-24 23:38:12.019461] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.110 [2024-07-24 23:38:12.020515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.110 [2024-07-24 23:38:12.020536] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:27.110 BaseBdev2 00:16:27.110 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:27.110 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:27.368 BaseBdev3_malloc 00:16:27.369 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:27.369 true 00:16:27.627 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:27.627 [2024-07-24 23:38:12.524203] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:27.627 [2024-07-24 23:38:12.524234] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.627 [2024-07-24 23:38:12.524245] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c1050 00:16:27.627 [2024-07-24 23:38:12.524255] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.627 [2024-07-24 23:38:12.525329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.627 [2024-07-24 23:38:12.525350] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:27.627 BaseBdev3 00:16:27.627 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:27.627 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:16:27.885 BaseBdev4_malloc 00:16:27.885 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:16:27.885 true 00:16:27.885 23:38:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:16:28.144 [2024-07-24 23:38:12.992787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:16:28.144 [2024-07-24 23:38:12.992816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.144 [2024-07-24 23:38:12.992827] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c1f20 00:16:28.144 [2024-07-24 23:38:12.992833] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.144 [2024-07-24 23:38:12.993844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.144 [2024-07-24 23:38:12.993864] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:16:28.144 BaseBdev4 00:16:28.144 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:16:28.402 [2024-07-24 23:38:13.161251] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:28.402 [2024-07-24 23:38:13.162124] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:28.402 [2024-07-24 23:38:13.162168] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:28.402 [2024-07-24 23:38:13.162207] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:28.402 [2024-07-24 23:38:13.162353] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13bc0a0 00:16:28.402 [2024-07-24 23:38:13.162359] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:28.402 [2024-07-24 23:38:13.162498] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12106e0 00:16:28.402 [2024-07-24 23:38:13.162601] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13bc0a0 00:16:28.402 [2024-07-24 23:38:13.162606] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13bc0a0 00:16:28.402 [2024-07-24 23:38:13.162671] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.402 23:38:13 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.402 "name": "raid_bdev1", 00:16:28.402 "uuid": "73efd04b-610e-48df-951e-f32323156e7f", 00:16:28.402 "strip_size_kb": 64, 00:16:28.402 "state": "online", 00:16:28.402 "raid_level": "concat", 00:16:28.402 "superblock": true, 00:16:28.402 "num_base_bdevs": 4, 00:16:28.402 "num_base_bdevs_discovered": 4, 00:16:28.402 "num_base_bdevs_operational": 4, 00:16:28.402 "base_bdevs_list": [ 00:16:28.402 { 00:16:28.402 "name": "BaseBdev1", 00:16:28.402 "uuid": "935c4e6b-7024-5360-b8b6-4a4a701e2d8f", 00:16:28.402 "is_configured": true, 00:16:28.402 "data_offset": 2048, 00:16:28.402 "data_size": 63488 00:16:28.402 }, 00:16:28.402 { 00:16:28.402 "name": "BaseBdev2", 00:16:28.402 "uuid": "b028c6c2-72b6-5674-97c5-134ffdf8387e", 00:16:28.402 "is_configured": true, 00:16:28.402 "data_offset": 2048, 00:16:28.402 "data_size": 63488 00:16:28.402 }, 00:16:28.402 { 00:16:28.402 "name": "BaseBdev3", 00:16:28.402 "uuid": "fa0d3590-ec02-59ce-b141-4c3647ffd442", 00:16:28.402 "is_configured": true, 00:16:28.402 "data_offset": 2048, 00:16:28.402 "data_size": 63488 00:16:28.402 }, 00:16:28.402 { 00:16:28.402 "name": "BaseBdev4", 00:16:28.402 "uuid": "342383ff-cbcb-5237-8957-840feda93a5a", 00:16:28.402 "is_configured": true, 00:16:28.402 "data_offset": 2048, 00:16:28.402 "data_size": 63488 00:16:28.402 } 00:16:28.402 ] 00:16:28.402 }' 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.402 23:38:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.968 23:38:13 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:16:28.968 23:38:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:28.968 [2024-07-24 23:38:13.879304] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13ae440 00:16:29.904 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.161 23:38:14 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.161 23:38:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:30.161 23:38:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.161 "name": "raid_bdev1", 00:16:30.162 "uuid": "73efd04b-610e-48df-951e-f32323156e7f", 00:16:30.162 "strip_size_kb": 64, 00:16:30.162 "state": "online", 00:16:30.162 "raid_level": "concat", 00:16:30.162 "superblock": true, 00:16:30.162 "num_base_bdevs": 4, 00:16:30.162 "num_base_bdevs_discovered": 4, 00:16:30.162 "num_base_bdevs_operational": 4, 00:16:30.162 "base_bdevs_list": [ 00:16:30.162 { 00:16:30.162 "name": "BaseBdev1", 00:16:30.162 "uuid": "935c4e6b-7024-5360-b8b6-4a4a701e2d8f", 00:16:30.162 "is_configured": true, 00:16:30.162 "data_offset": 2048, 00:16:30.162 "data_size": 63488 00:16:30.162 }, 00:16:30.162 { 00:16:30.162 "name": "BaseBdev2", 00:16:30.162 "uuid": "b028c6c2-72b6-5674-97c5-134ffdf8387e", 00:16:30.162 "is_configured": true, 00:16:30.162 "data_offset": 2048, 00:16:30.162 "data_size": 63488 00:16:30.162 }, 00:16:30.162 { 00:16:30.162 "name": "BaseBdev3", 00:16:30.162 "uuid": "fa0d3590-ec02-59ce-b141-4c3647ffd442", 00:16:30.162 "is_configured": true, 00:16:30.162 "data_offset": 2048, 00:16:30.162 "data_size": 63488 00:16:30.162 }, 00:16:30.162 { 00:16:30.162 "name": "BaseBdev4", 00:16:30.162 "uuid": "342383ff-cbcb-5237-8957-840feda93a5a", 00:16:30.162 "is_configured": true, 00:16:30.162 "data_offset": 2048, 00:16:30.162 "data_size": 63488 00:16:30.162 } 00:16:30.162 ] 00:16:30.162 }' 00:16:30.162 23:38:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.162 23:38:15 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:16:30.727 23:38:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:30.986 [2024-07-24 23:38:15.791882] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:30.986 [2024-07-24 23:38:15.791918] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:30.986 [2024-07-24 23:38:15.793955] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:30.986 [2024-07-24 23:38:15.793982] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.986 [2024-07-24 23:38:15.794008] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:30.986 [2024-07-24 23:38:15.794013] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13bc0a0 name raid_bdev1, state offline 00:16:30.986 0 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 329259 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 329259 ']' 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 329259 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 329259 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 329259' 00:16:30.986 killing process with pid 329259 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 329259 00:16:30.986 [2024-07-24 23:38:15.852534] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:30.986 23:38:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 329259 00:16:30.986 [2024-07-24 23:38:15.878191] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:31.244 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.tjAjMfvFtE 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:16:31.245 00:16:31.245 real 0m5.879s 00:16:31.245 user 0m9.224s 00:16:31.245 sys 0m0.850s 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:31.245 23:38:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.245 ************************************ 00:16:31.245 END TEST raid_write_error_test 00:16:31.245 ************************************ 00:16:31.245 23:38:16 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:31.245 23:38:16 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test 
raid_state_function_test raid1 4 false 00:16:31.245 23:38:16 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:31.245 23:38:16 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:31.245 23:38:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:31.245 ************************************ 00:16:31.245 START TEST raid_state_function_test 00:16:31.245 ************************************ 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 false 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=330377 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 330377' 00:16:31.245 Process raid pid: 330377 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 330377 /var/tmp/spdk-raid.sock 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 330377 ']' 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:31.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:31.245 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.245 [2024-07-24 23:38:16.197870] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:16:31.245 [2024-07-24 23:38:16.197911] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:31.503 [2024-07-24 23:38:16.263599] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:31.503 [2024-07-24 23:38:16.341826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:31.503 [2024-07-24 23:38:16.392732] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.503 [2024-07-24 23:38:16.392758] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.069 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:32.069 23:38:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:16:32.069 23:38:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:32.328 [2024-07-24 23:38:17.127161] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:32.328 [2024-07-24 23:38:17.127188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:32.328 [2024-07-24 23:38:17.127196] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:32.328 [2024-07-24 23:38:17.127201] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:32.328 [2024-07-24 23:38:17.127205] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:32.328 [2024-07-24 23:38:17.127210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:32.328 
[2024-07-24 23:38:17.127214] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:32.328 [2024-07-24 23:38:17.127219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.328 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.328 "name": "Existed_Raid", 00:16:32.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.328 "strip_size_kb": 0, 00:16:32.328 "state": 
"configuring", 00:16:32.328 "raid_level": "raid1", 00:16:32.328 "superblock": false, 00:16:32.328 "num_base_bdevs": 4, 00:16:32.328 "num_base_bdevs_discovered": 0, 00:16:32.328 "num_base_bdevs_operational": 4, 00:16:32.328 "base_bdevs_list": [ 00:16:32.328 { 00:16:32.328 "name": "BaseBdev1", 00:16:32.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.328 "is_configured": false, 00:16:32.328 "data_offset": 0, 00:16:32.328 "data_size": 0 00:16:32.328 }, 00:16:32.328 { 00:16:32.328 "name": "BaseBdev2", 00:16:32.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.328 "is_configured": false, 00:16:32.328 "data_offset": 0, 00:16:32.328 "data_size": 0 00:16:32.328 }, 00:16:32.328 { 00:16:32.328 "name": "BaseBdev3", 00:16:32.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.328 "is_configured": false, 00:16:32.329 "data_offset": 0, 00:16:32.329 "data_size": 0 00:16:32.329 }, 00:16:32.329 { 00:16:32.329 "name": "BaseBdev4", 00:16:32.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.329 "is_configured": false, 00:16:32.329 "data_offset": 0, 00:16:32.329 "data_size": 0 00:16:32.329 } 00:16:32.329 ] 00:16:32.329 }' 00:16:32.329 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.329 23:38:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.894 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.153 [2024-07-24 23:38:17.917125] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.153 [2024-07-24 23:38:17.917144] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180fb50 name Existed_Raid, state configuring 00:16:33.153 23:38:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:33.153 [2024-07-24 23:38:18.085566] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:33.153 [2024-07-24 23:38:18.085583] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:33.153 [2024-07-24 23:38:18.085588] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:33.153 [2024-07-24 23:38:18.085593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:33.153 [2024-07-24 23:38:18.085597] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:33.153 [2024-07-24 23:38:18.085601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:33.153 [2024-07-24 23:38:18.085621] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:33.153 [2024-07-24 23:38:18.085626] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:33.153 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:33.412 [2024-07-24 23:38:18.249950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:33.412 BaseBdev1 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:33.412 23:38:18 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:33.412 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:33.671 [ 00:16:33.671 { 00:16:33.671 "name": "BaseBdev1", 00:16:33.671 "aliases": [ 00:16:33.671 "86694326-292b-48d7-a818-e6840d19de15" 00:16:33.671 ], 00:16:33.671 "product_name": "Malloc disk", 00:16:33.671 "block_size": 512, 00:16:33.671 "num_blocks": 65536, 00:16:33.671 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:33.671 "assigned_rate_limits": { 00:16:33.671 "rw_ios_per_sec": 0, 00:16:33.671 "rw_mbytes_per_sec": 0, 00:16:33.671 "r_mbytes_per_sec": 0, 00:16:33.671 "w_mbytes_per_sec": 0 00:16:33.671 }, 00:16:33.671 "claimed": true, 00:16:33.671 "claim_type": "exclusive_write", 00:16:33.671 "zoned": false, 00:16:33.671 "supported_io_types": { 00:16:33.671 "read": true, 00:16:33.671 "write": true, 00:16:33.671 "unmap": true, 00:16:33.671 "flush": true, 00:16:33.671 "reset": true, 00:16:33.671 "nvme_admin": false, 00:16:33.671 "nvme_io": false, 00:16:33.671 "nvme_io_md": false, 00:16:33.671 "write_zeroes": true, 00:16:33.671 "zcopy": true, 00:16:33.671 "get_zone_info": false, 00:16:33.671 "zone_management": false, 00:16:33.671 "zone_append": false, 00:16:33.671 "compare": false, 00:16:33.671 "compare_and_write": false, 00:16:33.671 "abort": true, 00:16:33.671 "seek_hole": false, 00:16:33.671 "seek_data": false, 00:16:33.671 "copy": true, 00:16:33.671 "nvme_iov_md": false 00:16:33.671 }, 00:16:33.671 "memory_domains": [ 00:16:33.671 { 
00:16:33.671 "dma_device_id": "system", 00:16:33.671 "dma_device_type": 1 00:16:33.671 }, 00:16:33.671 { 00:16:33.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.671 "dma_device_type": 2 00:16:33.671 } 00:16:33.671 ], 00:16:33.671 "driver_specific": {} 00:16:33.671 } 00:16:33.671 ] 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.671 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.930 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:16:33.930 "name": "Existed_Raid", 00:16:33.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.930 "strip_size_kb": 0, 00:16:33.930 "state": "configuring", 00:16:33.930 "raid_level": "raid1", 00:16:33.930 "superblock": false, 00:16:33.930 "num_base_bdevs": 4, 00:16:33.930 "num_base_bdevs_discovered": 1, 00:16:33.930 "num_base_bdevs_operational": 4, 00:16:33.930 "base_bdevs_list": [ 00:16:33.930 { 00:16:33.930 "name": "BaseBdev1", 00:16:33.930 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:33.930 "is_configured": true, 00:16:33.930 "data_offset": 0, 00:16:33.930 "data_size": 65536 00:16:33.930 }, 00:16:33.930 { 00:16:33.930 "name": "BaseBdev2", 00:16:33.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.930 "is_configured": false, 00:16:33.930 "data_offset": 0, 00:16:33.930 "data_size": 0 00:16:33.930 }, 00:16:33.930 { 00:16:33.930 "name": "BaseBdev3", 00:16:33.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.931 "is_configured": false, 00:16:33.931 "data_offset": 0, 00:16:33.931 "data_size": 0 00:16:33.931 }, 00:16:33.931 { 00:16:33.931 "name": "BaseBdev4", 00:16:33.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.931 "is_configured": false, 00:16:33.931 "data_offset": 0, 00:16:33.931 "data_size": 0 00:16:33.931 } 00:16:33.931 ] 00:16:33.931 }' 00:16:33.931 23:38:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.931 23:38:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.497 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:34.497 [2024-07-24 23:38:19.396903] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:34.497 [2024-07-24 23:38:19.396938] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180f3a0 name Existed_Raid, state configuring 
00:16:34.497 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:34.756 [2024-07-24 23:38:19.565355] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:34.756 [2024-07-24 23:38:19.566318] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:34.756 [2024-07-24 23:38:19.566340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:34.756 [2024-07-24 23:38:19.566345] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:34.756 [2024-07-24 23:38:19.566350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:34.756 [2024-07-24 23:38:19.566354] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:34.756 [2024-07-24 23:38:19.566358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.756 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.015 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.015 "name": "Existed_Raid", 00:16:35.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.015 "strip_size_kb": 0, 00:16:35.015 "state": "configuring", 00:16:35.015 "raid_level": "raid1", 00:16:35.015 "superblock": false, 00:16:35.015 "num_base_bdevs": 4, 00:16:35.015 "num_base_bdevs_discovered": 1, 00:16:35.015 "num_base_bdevs_operational": 4, 00:16:35.015 "base_bdevs_list": [ 00:16:35.015 { 00:16:35.015 "name": "BaseBdev1", 00:16:35.015 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:35.015 "is_configured": true, 00:16:35.015 "data_offset": 0, 00:16:35.015 "data_size": 65536 00:16:35.015 }, 00:16:35.015 { 00:16:35.015 "name": "BaseBdev2", 00:16:35.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.015 "is_configured": false, 00:16:35.015 "data_offset": 0, 00:16:35.015 "data_size": 0 00:16:35.015 }, 00:16:35.015 { 00:16:35.015 "name": "BaseBdev3", 00:16:35.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.015 "is_configured": false, 00:16:35.015 
"data_offset": 0, 00:16:35.015 "data_size": 0 00:16:35.015 }, 00:16:35.015 { 00:16:35.015 "name": "BaseBdev4", 00:16:35.015 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:35.015 "is_configured": false, 00:16:35.015 "data_offset": 0, 00:16:35.015 "data_size": 0 00:16:35.015 } 00:16:35.015 ] 00:16:35.015 }' 00:16:35.015 23:38:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.015 23:38:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.272 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:35.563 [2024-07-24 23:38:20.414313] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:35.563 BaseBdev2 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:35.563 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:35.821 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:35.821 [ 
00:16:35.821 { 00:16:35.821 "name": "BaseBdev2", 00:16:35.821 "aliases": [ 00:16:35.821 "969a14f1-bb71-445e-b007-c105d012262d" 00:16:35.821 ], 00:16:35.821 "product_name": "Malloc disk", 00:16:35.821 "block_size": 512, 00:16:35.821 "num_blocks": 65536, 00:16:35.821 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:35.821 "assigned_rate_limits": { 00:16:35.821 "rw_ios_per_sec": 0, 00:16:35.821 "rw_mbytes_per_sec": 0, 00:16:35.821 "r_mbytes_per_sec": 0, 00:16:35.821 "w_mbytes_per_sec": 0 00:16:35.821 }, 00:16:35.821 "claimed": true, 00:16:35.821 "claim_type": "exclusive_write", 00:16:35.821 "zoned": false, 00:16:35.821 "supported_io_types": { 00:16:35.821 "read": true, 00:16:35.821 "write": true, 00:16:35.821 "unmap": true, 00:16:35.821 "flush": true, 00:16:35.821 "reset": true, 00:16:35.821 "nvme_admin": false, 00:16:35.821 "nvme_io": false, 00:16:35.821 "nvme_io_md": false, 00:16:35.821 "write_zeroes": true, 00:16:35.821 "zcopy": true, 00:16:35.821 "get_zone_info": false, 00:16:35.821 "zone_management": false, 00:16:35.821 "zone_append": false, 00:16:35.822 "compare": false, 00:16:35.822 "compare_and_write": false, 00:16:35.822 "abort": true, 00:16:35.822 "seek_hole": false, 00:16:35.822 "seek_data": false, 00:16:35.822 "copy": true, 00:16:35.822 "nvme_iov_md": false 00:16:35.822 }, 00:16:35.822 "memory_domains": [ 00:16:35.822 { 00:16:35.822 "dma_device_id": "system", 00:16:35.822 "dma_device_type": 1 00:16:35.822 }, 00:16:35.822 { 00:16:35.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.822 "dma_device_type": 2 00:16:35.822 } 00:16:35.822 ], 00:16:35.822 "driver_specific": {} 00:16:35.822 } 00:16:35.822 ] 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:35.822 23:38:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.822 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.081 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:36.081 "name": "Existed_Raid", 00:16:36.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.081 "strip_size_kb": 0, 00:16:36.081 "state": "configuring", 00:16:36.081 "raid_level": "raid1", 00:16:36.081 "superblock": false, 00:16:36.081 "num_base_bdevs": 4, 00:16:36.081 "num_base_bdevs_discovered": 2, 00:16:36.081 "num_base_bdevs_operational": 4, 00:16:36.081 "base_bdevs_list": [ 00:16:36.081 { 00:16:36.081 
"name": "BaseBdev1", 00:16:36.081 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:36.081 "is_configured": true, 00:16:36.081 "data_offset": 0, 00:16:36.081 "data_size": 65536 00:16:36.081 }, 00:16:36.081 { 00:16:36.081 "name": "BaseBdev2", 00:16:36.081 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:36.081 "is_configured": true, 00:16:36.081 "data_offset": 0, 00:16:36.081 "data_size": 65536 00:16:36.081 }, 00:16:36.081 { 00:16:36.081 "name": "BaseBdev3", 00:16:36.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.081 "is_configured": false, 00:16:36.081 "data_offset": 0, 00:16:36.081 "data_size": 0 00:16:36.081 }, 00:16:36.081 { 00:16:36.081 "name": "BaseBdev4", 00:16:36.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.081 "is_configured": false, 00:16:36.081 "data_offset": 0, 00:16:36.081 "data_size": 0 00:16:36.081 } 00:16:36.081 ] 00:16:36.081 }' 00:16:36.081 23:38:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:36.081 23:38:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:36.708 [2024-07-24 23:38:21.591950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:36.708 BaseBdev3 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:36.708 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:36.967 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:36.967 [ 00:16:36.967 { 00:16:36.967 "name": "BaseBdev3", 00:16:36.967 "aliases": [ 00:16:36.967 "655c258a-83b7-4741-aea3-96d8768f113a" 00:16:36.967 ], 00:16:36.967 "product_name": "Malloc disk", 00:16:36.967 "block_size": 512, 00:16:36.967 "num_blocks": 65536, 00:16:36.967 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:36.967 "assigned_rate_limits": { 00:16:36.967 "rw_ios_per_sec": 0, 00:16:36.967 "rw_mbytes_per_sec": 0, 00:16:36.967 "r_mbytes_per_sec": 0, 00:16:36.967 "w_mbytes_per_sec": 0 00:16:36.967 }, 00:16:36.967 "claimed": true, 00:16:36.967 "claim_type": "exclusive_write", 00:16:36.967 "zoned": false, 00:16:36.967 "supported_io_types": { 00:16:36.967 "read": true, 00:16:36.967 "write": true, 00:16:36.967 "unmap": true, 00:16:36.967 "flush": true, 00:16:36.967 "reset": true, 00:16:36.967 "nvme_admin": false, 00:16:36.967 "nvme_io": false, 00:16:36.967 "nvme_io_md": false, 00:16:36.967 "write_zeroes": true, 00:16:36.967 "zcopy": true, 00:16:36.967 "get_zone_info": false, 00:16:36.967 "zone_management": false, 00:16:36.967 "zone_append": false, 00:16:36.967 "compare": false, 00:16:36.967 "compare_and_write": false, 00:16:36.967 "abort": true, 00:16:36.967 "seek_hole": false, 00:16:36.967 "seek_data": false, 00:16:36.967 "copy": true, 00:16:36.967 "nvme_iov_md": false 00:16:36.967 }, 00:16:36.967 "memory_domains": [ 00:16:36.967 { 00:16:36.967 "dma_device_id": "system", 
00:16:36.967 "dma_device_type": 1 00:16:36.967 }, 00:16:36.967 { 00:16:36.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.967 "dma_device_type": 2 00:16:36.967 } 00:16:36.967 ], 00:16:36.967 "driver_specific": {} 00:16:36.967 } 00:16:36.967 ] 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.226 23:38:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.226 23:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.226 "name": "Existed_Raid", 00:16:37.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.226 "strip_size_kb": 0, 00:16:37.226 "state": "configuring", 00:16:37.226 "raid_level": "raid1", 00:16:37.226 "superblock": false, 00:16:37.226 "num_base_bdevs": 4, 00:16:37.226 "num_base_bdevs_discovered": 3, 00:16:37.226 "num_base_bdevs_operational": 4, 00:16:37.226 "base_bdevs_list": [ 00:16:37.226 { 00:16:37.226 "name": "BaseBdev1", 00:16:37.226 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:37.226 "is_configured": true, 00:16:37.226 "data_offset": 0, 00:16:37.226 "data_size": 65536 00:16:37.226 }, 00:16:37.226 { 00:16:37.226 "name": "BaseBdev2", 00:16:37.226 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:37.226 "is_configured": true, 00:16:37.226 "data_offset": 0, 00:16:37.226 "data_size": 65536 00:16:37.226 }, 00:16:37.226 { 00:16:37.226 "name": "BaseBdev3", 00:16:37.226 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:37.226 "is_configured": true, 00:16:37.226 "data_offset": 0, 00:16:37.226 "data_size": 65536 00:16:37.226 }, 00:16:37.226 { 00:16:37.226 "name": "BaseBdev4", 00:16:37.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.226 "is_configured": false, 00:16:37.226 "data_offset": 0, 00:16:37.226 "data_size": 0 00:16:37.226 } 00:16:37.226 ] 00:16:37.226 }' 00:16:37.226 23:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.226 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:37.793 [2024-07-24 23:38:22.757486] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:37.793 [2024-07-24 23:38:22.757517] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18103d0 00:16:37.793 [2024-07-24 23:38:22.757520] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:37.793 [2024-07-24 23:38:22.757666] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1815e50 00:16:37.793 [2024-07-24 23:38:22.757754] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18103d0 00:16:37.793 [2024-07-24 23:38:22.757759] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18103d0 00:16:37.793 [2024-07-24 23:38:22.757870] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:37.793 BaseBdev4 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:37.793 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.051 23:38:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:38.310 [ 00:16:38.310 { 
00:16:38.310 "name": "BaseBdev4", 00:16:38.310 "aliases": [ 00:16:38.310 "d8ee01a5-6f3f-4769-b105-58b373014b49" 00:16:38.310 ], 00:16:38.310 "product_name": "Malloc disk", 00:16:38.310 "block_size": 512, 00:16:38.310 "num_blocks": 65536, 00:16:38.310 "uuid": "d8ee01a5-6f3f-4769-b105-58b373014b49", 00:16:38.310 "assigned_rate_limits": { 00:16:38.310 "rw_ios_per_sec": 0, 00:16:38.310 "rw_mbytes_per_sec": 0, 00:16:38.310 "r_mbytes_per_sec": 0, 00:16:38.310 "w_mbytes_per_sec": 0 00:16:38.310 }, 00:16:38.310 "claimed": true, 00:16:38.310 "claim_type": "exclusive_write", 00:16:38.310 "zoned": false, 00:16:38.310 "supported_io_types": { 00:16:38.310 "read": true, 00:16:38.310 "write": true, 00:16:38.310 "unmap": true, 00:16:38.310 "flush": true, 00:16:38.310 "reset": true, 00:16:38.310 "nvme_admin": false, 00:16:38.310 "nvme_io": false, 00:16:38.310 "nvme_io_md": false, 00:16:38.310 "write_zeroes": true, 00:16:38.310 "zcopy": true, 00:16:38.310 "get_zone_info": false, 00:16:38.310 "zone_management": false, 00:16:38.310 "zone_append": false, 00:16:38.310 "compare": false, 00:16:38.310 "compare_and_write": false, 00:16:38.310 "abort": true, 00:16:38.310 "seek_hole": false, 00:16:38.310 "seek_data": false, 00:16:38.310 "copy": true, 00:16:38.310 "nvme_iov_md": false 00:16:38.310 }, 00:16:38.310 "memory_domains": [ 00:16:38.310 { 00:16:38.310 "dma_device_id": "system", 00:16:38.310 "dma_device_type": 1 00:16:38.310 }, 00:16:38.310 { 00:16:38.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.310 "dma_device_type": 2 00:16:38.310 } 00:16:38.310 ], 00:16:38.310 "driver_specific": {} 00:16:38.310 } 00:16:38.310 ] 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:38.310 23:38:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.310 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.568 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.568 "name": "Existed_Raid", 00:16:38.568 "uuid": "2cdf9b7b-2f87-4898-9d42-0df72781c699", 00:16:38.568 "strip_size_kb": 0, 00:16:38.568 "state": "online", 00:16:38.568 "raid_level": "raid1", 00:16:38.568 "superblock": false, 00:16:38.568 "num_base_bdevs": 4, 00:16:38.568 "num_base_bdevs_discovered": 4, 00:16:38.568 "num_base_bdevs_operational": 4, 00:16:38.568 "base_bdevs_list": [ 00:16:38.568 { 00:16:38.568 "name": 
"BaseBdev1", 00:16:38.568 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:38.568 "is_configured": true, 00:16:38.568 "data_offset": 0, 00:16:38.568 "data_size": 65536 00:16:38.568 }, 00:16:38.568 { 00:16:38.568 "name": "BaseBdev2", 00:16:38.568 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:38.568 "is_configured": true, 00:16:38.568 "data_offset": 0, 00:16:38.568 "data_size": 65536 00:16:38.568 }, 00:16:38.568 { 00:16:38.568 "name": "BaseBdev3", 00:16:38.568 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:38.568 "is_configured": true, 00:16:38.568 "data_offset": 0, 00:16:38.568 "data_size": 65536 00:16:38.568 }, 00:16:38.568 { 00:16:38.568 "name": "BaseBdev4", 00:16:38.568 "uuid": "d8ee01a5-6f3f-4769-b105-58b373014b49", 00:16:38.568 "is_configured": true, 00:16:38.568 "data_offset": 0, 00:16:38.568 "data_size": 65536 00:16:38.568 } 00:16:38.568 ] 00:16:38.568 }' 00:16:38.568 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.568 23:38:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:38.826 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:38.826 23:38:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:39.084 [2024-07-24 23:38:23.912709] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:39.084 "name": "Existed_Raid", 00:16:39.084 "aliases": [ 00:16:39.084 "2cdf9b7b-2f87-4898-9d42-0df72781c699" 00:16:39.084 ], 00:16:39.084 "product_name": "Raid Volume", 00:16:39.084 "block_size": 512, 00:16:39.084 "num_blocks": 65536, 00:16:39.084 "uuid": "2cdf9b7b-2f87-4898-9d42-0df72781c699", 00:16:39.084 "assigned_rate_limits": { 00:16:39.084 "rw_ios_per_sec": 0, 00:16:39.084 "rw_mbytes_per_sec": 0, 00:16:39.084 "r_mbytes_per_sec": 0, 00:16:39.084 "w_mbytes_per_sec": 0 00:16:39.084 }, 00:16:39.084 "claimed": false, 00:16:39.084 "zoned": false, 00:16:39.084 "supported_io_types": { 00:16:39.084 "read": true, 00:16:39.084 "write": true, 00:16:39.084 "unmap": false, 00:16:39.084 "flush": false, 00:16:39.084 "reset": true, 00:16:39.084 "nvme_admin": false, 00:16:39.084 "nvme_io": false, 00:16:39.084 "nvme_io_md": false, 00:16:39.084 "write_zeroes": true, 00:16:39.084 "zcopy": false, 00:16:39.084 "get_zone_info": false, 00:16:39.084 "zone_management": false, 00:16:39.084 "zone_append": false, 00:16:39.084 "compare": false, 00:16:39.084 "compare_and_write": false, 00:16:39.084 "abort": false, 00:16:39.084 "seek_hole": false, 00:16:39.084 "seek_data": false, 00:16:39.084 "copy": false, 00:16:39.084 "nvme_iov_md": false 00:16:39.084 }, 00:16:39.084 "memory_domains": [ 00:16:39.084 { 00:16:39.084 "dma_device_id": "system", 00:16:39.084 "dma_device_type": 1 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.084 "dma_device_type": 2 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "system", 00:16:39.084 "dma_device_type": 1 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:16:39.084 "dma_device_type": 2 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "system", 00:16:39.084 "dma_device_type": 1 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.084 "dma_device_type": 2 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "system", 00:16:39.084 "dma_device_type": 1 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.084 "dma_device_type": 2 00:16:39.084 } 00:16:39.084 ], 00:16:39.084 "driver_specific": { 00:16:39.084 "raid": { 00:16:39.084 "uuid": "2cdf9b7b-2f87-4898-9d42-0df72781c699", 00:16:39.084 "strip_size_kb": 0, 00:16:39.084 "state": "online", 00:16:39.084 "raid_level": "raid1", 00:16:39.084 "superblock": false, 00:16:39.084 "num_base_bdevs": 4, 00:16:39.084 "num_base_bdevs_discovered": 4, 00:16:39.084 "num_base_bdevs_operational": 4, 00:16:39.084 "base_bdevs_list": [ 00:16:39.084 { 00:16:39.084 "name": "BaseBdev1", 00:16:39.084 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:39.084 "is_configured": true, 00:16:39.084 "data_offset": 0, 00:16:39.084 "data_size": 65536 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "name": "BaseBdev2", 00:16:39.084 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:39.084 "is_configured": true, 00:16:39.084 "data_offset": 0, 00:16:39.084 "data_size": 65536 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "name": "BaseBdev3", 00:16:39.084 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:39.084 "is_configured": true, 00:16:39.084 "data_offset": 0, 00:16:39.084 "data_size": 65536 00:16:39.084 }, 00:16:39.084 { 00:16:39.084 "name": "BaseBdev4", 00:16:39.084 "uuid": "d8ee01a5-6f3f-4769-b105-58b373014b49", 00:16:39.084 "is_configured": true, 00:16:39.084 "data_offset": 0, 00:16:39.084 "data_size": 65536 00:16:39.084 } 00:16:39.084 ] 00:16:39.084 } 00:16:39.084 } 00:16:39.084 }' 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:39.084 BaseBdev2 00:16:39.084 BaseBdev3 00:16:39.084 BaseBdev4' 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:39.084 23:38:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.342 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.342 "name": "BaseBdev1", 00:16:39.342 "aliases": [ 00:16:39.342 "86694326-292b-48d7-a818-e6840d19de15" 00:16:39.342 ], 00:16:39.342 "product_name": "Malloc disk", 00:16:39.342 "block_size": 512, 00:16:39.342 "num_blocks": 65536, 00:16:39.342 "uuid": "86694326-292b-48d7-a818-e6840d19de15", 00:16:39.342 "assigned_rate_limits": { 00:16:39.342 "rw_ios_per_sec": 0, 00:16:39.342 "rw_mbytes_per_sec": 0, 00:16:39.342 "r_mbytes_per_sec": 0, 00:16:39.342 "w_mbytes_per_sec": 0 00:16:39.342 }, 00:16:39.342 "claimed": true, 00:16:39.342 "claim_type": "exclusive_write", 00:16:39.342 "zoned": false, 00:16:39.342 "supported_io_types": { 00:16:39.342 "read": true, 00:16:39.342 "write": true, 00:16:39.342 "unmap": true, 00:16:39.342 "flush": true, 00:16:39.342 "reset": true, 00:16:39.342 "nvme_admin": false, 00:16:39.342 "nvme_io": false, 00:16:39.342 "nvme_io_md": false, 00:16:39.342 "write_zeroes": true, 00:16:39.342 "zcopy": true, 00:16:39.342 "get_zone_info": false, 00:16:39.342 "zone_management": false, 00:16:39.342 "zone_append": false, 00:16:39.342 "compare": false, 00:16:39.342 "compare_and_write": false, 00:16:39.342 "abort": true, 00:16:39.342 "seek_hole": false, 00:16:39.342 "seek_data": 
false, 00:16:39.342 "copy": true, 00:16:39.342 "nvme_iov_md": false 00:16:39.342 }, 00:16:39.342 "memory_domains": [ 00:16:39.342 { 00:16:39.342 "dma_device_id": "system", 00:16:39.342 "dma_device_type": 1 00:16:39.342 }, 00:16:39.342 { 00:16:39.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.343 "dma_device_type": 2 00:16:39.343 } 00:16:39.343 ], 00:16:39.343 "driver_specific": {} 00:16:39.343 }' 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.343 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:39.601 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:39.860 "name": "BaseBdev2", 00:16:39.860 "aliases": [ 00:16:39.860 "969a14f1-bb71-445e-b007-c105d012262d" 00:16:39.860 ], 00:16:39.860 "product_name": "Malloc disk", 00:16:39.860 "block_size": 512, 00:16:39.860 "num_blocks": 65536, 00:16:39.860 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:39.860 "assigned_rate_limits": { 00:16:39.860 "rw_ios_per_sec": 0, 00:16:39.860 "rw_mbytes_per_sec": 0, 00:16:39.860 "r_mbytes_per_sec": 0, 00:16:39.860 "w_mbytes_per_sec": 0 00:16:39.860 }, 00:16:39.860 "claimed": true, 00:16:39.860 "claim_type": "exclusive_write", 00:16:39.860 "zoned": false, 00:16:39.860 "supported_io_types": { 00:16:39.860 "read": true, 00:16:39.860 "write": true, 00:16:39.860 "unmap": true, 00:16:39.860 "flush": true, 00:16:39.860 "reset": true, 00:16:39.860 "nvme_admin": false, 00:16:39.860 "nvme_io": false, 00:16:39.860 "nvme_io_md": false, 00:16:39.860 "write_zeroes": true, 00:16:39.860 "zcopy": true, 00:16:39.860 "get_zone_info": false, 00:16:39.860 "zone_management": false, 00:16:39.860 "zone_append": false, 00:16:39.860 "compare": false, 00:16:39.860 "compare_and_write": false, 00:16:39.860 "abort": true, 00:16:39.860 "seek_hole": false, 00:16:39.860 "seek_data": false, 00:16:39.860 "copy": true, 00:16:39.860 "nvme_iov_md": false 00:16:39.860 }, 00:16:39.860 "memory_domains": [ 00:16:39.860 { 00:16:39.860 "dma_device_id": "system", 00:16:39.860 "dma_device_type": 1 00:16:39.860 }, 00:16:39.860 { 00:16:39.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.860 "dma_device_type": 2 00:16:39.860 } 00:16:39.860 ], 00:16:39.860 "driver_specific": {} 00:16:39.860 }' 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:39.860 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:40.119 23:38:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:40.119 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:40.119 "name": "BaseBdev3", 00:16:40.119 "aliases": [ 00:16:40.119 "655c258a-83b7-4741-aea3-96d8768f113a" 00:16:40.119 ], 00:16:40.119 "product_name": "Malloc disk", 00:16:40.119 "block_size": 512, 00:16:40.119 "num_blocks": 65536, 00:16:40.119 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:40.119 "assigned_rate_limits": { 00:16:40.119 "rw_ios_per_sec": 0, 00:16:40.119 
"rw_mbytes_per_sec": 0, 00:16:40.119 "r_mbytes_per_sec": 0, 00:16:40.119 "w_mbytes_per_sec": 0 00:16:40.119 }, 00:16:40.119 "claimed": true, 00:16:40.119 "claim_type": "exclusive_write", 00:16:40.119 "zoned": false, 00:16:40.119 "supported_io_types": { 00:16:40.119 "read": true, 00:16:40.119 "write": true, 00:16:40.119 "unmap": true, 00:16:40.119 "flush": true, 00:16:40.119 "reset": true, 00:16:40.119 "nvme_admin": false, 00:16:40.119 "nvme_io": false, 00:16:40.119 "nvme_io_md": false, 00:16:40.119 "write_zeroes": true, 00:16:40.119 "zcopy": true, 00:16:40.119 "get_zone_info": false, 00:16:40.119 "zone_management": false, 00:16:40.119 "zone_append": false, 00:16:40.119 "compare": false, 00:16:40.119 "compare_and_write": false, 00:16:40.119 "abort": true, 00:16:40.119 "seek_hole": false, 00:16:40.119 "seek_data": false, 00:16:40.119 "copy": true, 00:16:40.119 "nvme_iov_md": false 00:16:40.119 }, 00:16:40.119 "memory_domains": [ 00:16:40.119 { 00:16:40.119 "dma_device_id": "system", 00:16:40.119 "dma_device_type": 1 00:16:40.119 }, 00:16:40.119 { 00:16:40.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.119 "dma_device_type": 2 00:16:40.119 } 00:16:40.119 ], 00:16:40.119 "driver_specific": {} 00:16:40.119 }' 00:16:40.119 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.379 23:38:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:40.379 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:40.637 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:40.637 "name": "BaseBdev4", 00:16:40.637 "aliases": [ 00:16:40.637 "d8ee01a5-6f3f-4769-b105-58b373014b49" 00:16:40.637 ], 00:16:40.637 "product_name": "Malloc disk", 00:16:40.637 "block_size": 512, 00:16:40.637 "num_blocks": 65536, 00:16:40.637 "uuid": "d8ee01a5-6f3f-4769-b105-58b373014b49", 00:16:40.637 "assigned_rate_limits": { 00:16:40.637 "rw_ios_per_sec": 0, 00:16:40.637 "rw_mbytes_per_sec": 0, 00:16:40.637 "r_mbytes_per_sec": 0, 00:16:40.637 "w_mbytes_per_sec": 0 00:16:40.637 }, 00:16:40.637 "claimed": true, 00:16:40.637 "claim_type": "exclusive_write", 00:16:40.637 "zoned": false, 00:16:40.637 "supported_io_types": { 00:16:40.637 "read": true, 00:16:40.637 "write": true, 00:16:40.637 "unmap": true, 00:16:40.637 "flush": true, 00:16:40.637 "reset": true, 00:16:40.637 "nvme_admin": false, 00:16:40.637 "nvme_io": false, 00:16:40.637 "nvme_io_md": false, 00:16:40.637 "write_zeroes": true, 00:16:40.637 "zcopy": true, 00:16:40.637 "get_zone_info": false, 
00:16:40.637 "zone_management": false, 00:16:40.637 "zone_append": false, 00:16:40.637 "compare": false, 00:16:40.637 "compare_and_write": false, 00:16:40.637 "abort": true, 00:16:40.637 "seek_hole": false, 00:16:40.637 "seek_data": false, 00:16:40.637 "copy": true, 00:16:40.637 "nvme_iov_md": false 00:16:40.637 }, 00:16:40.637 "memory_domains": [ 00:16:40.637 { 00:16:40.637 "dma_device_id": "system", 00:16:40.637 "dma_device_type": 1 00:16:40.637 }, 00:16:40.637 { 00:16:40.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.637 "dma_device_type": 2 00:16:40.637 } 00:16:40.637 ], 00:16:40.637 "driver_specific": {} 00:16:40.637 }' 00:16:40.637 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.637 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:40.637 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:40.637 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:40.896 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:41.154 [2024-07-24 23:38:25.965856] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.154 23:38:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.154 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.154 "name": "Existed_Raid", 00:16:41.154 "uuid": "2cdf9b7b-2f87-4898-9d42-0df72781c699", 00:16:41.154 "strip_size_kb": 0, 00:16:41.154 "state": "online", 00:16:41.154 "raid_level": "raid1", 00:16:41.154 "superblock": false, 00:16:41.154 "num_base_bdevs": 4, 00:16:41.154 "num_base_bdevs_discovered": 3, 00:16:41.154 "num_base_bdevs_operational": 3, 00:16:41.154 "base_bdevs_list": [ 00:16:41.154 { 00:16:41.154 "name": null, 00:16:41.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.154 "is_configured": false, 00:16:41.154 "data_offset": 0, 00:16:41.154 "data_size": 65536 00:16:41.154 }, 00:16:41.154 { 00:16:41.154 "name": "BaseBdev2", 00:16:41.154 "uuid": "969a14f1-bb71-445e-b007-c105d012262d", 00:16:41.154 "is_configured": true, 00:16:41.154 "data_offset": 0, 00:16:41.154 "data_size": 65536 00:16:41.154 }, 00:16:41.154 { 00:16:41.154 "name": "BaseBdev3", 00:16:41.154 "uuid": "655c258a-83b7-4741-aea3-96d8768f113a", 00:16:41.154 "is_configured": true, 00:16:41.154 "data_offset": 0, 00:16:41.154 "data_size": 65536 00:16:41.154 }, 00:16:41.154 { 00:16:41.154 "name": "BaseBdev4", 00:16:41.154 "uuid": "d8ee01a5-6f3f-4769-b105-58b373014b49", 00:16:41.154 "is_configured": true, 00:16:41.154 "data_offset": 0, 00:16:41.154 "data_size": 65536 00:16:41.154 } 00:16:41.154 ] 00:16:41.154 }' 00:16:41.154 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.154 23:38:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.721 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:41.721 23:38:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.721 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:41.721 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.979 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:41.979 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:41.980 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:41.980 [2024-07-24 23:38:26.937248] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:41.980 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:41.980 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:41.980 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.980 23:38:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:42.238 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:42.239 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:42.239 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:42.497 [2024-07-24 23:38:27.279835] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:42.497 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:42.756 [2024-07-24 23:38:27.618476] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:42.756 [2024-07-24 23:38:27.618531] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:42.756 [2024-07-24 23:38:27.628527] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:42.756 [2024-07-24 23:38:27.628552] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:42.756 [2024-07-24 23:38:27.628558] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18103d0 name Existed_Raid, state offline 00:16:42.756 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:42.756 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:42.756 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.756 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:43.015 BaseBdev2 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:43.015 23:38:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:43.273 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:43.532 [ 00:16:43.532 { 00:16:43.532 "name": "BaseBdev2", 00:16:43.532 "aliases": [ 00:16:43.532 "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5" 00:16:43.532 ], 00:16:43.532 "product_name": "Malloc disk", 00:16:43.532 "block_size": 512, 00:16:43.532 "num_blocks": 65536, 00:16:43.532 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:43.532 "assigned_rate_limits": { 00:16:43.532 "rw_ios_per_sec": 0, 00:16:43.532 "rw_mbytes_per_sec": 0, 00:16:43.532 "r_mbytes_per_sec": 0, 00:16:43.532 "w_mbytes_per_sec": 0 00:16:43.532 }, 00:16:43.532 "claimed": false, 00:16:43.532 "zoned": false, 00:16:43.532 "supported_io_types": { 00:16:43.532 "read": true, 00:16:43.532 "write": true, 00:16:43.532 "unmap": true, 00:16:43.532 "flush": true, 00:16:43.532 "reset": true, 00:16:43.532 "nvme_admin": false, 00:16:43.532 "nvme_io": false, 00:16:43.532 "nvme_io_md": false, 00:16:43.532 "write_zeroes": true, 00:16:43.532 "zcopy": true, 00:16:43.532 "get_zone_info": false, 00:16:43.532 "zone_management": false, 00:16:43.532 "zone_append": false, 00:16:43.532 "compare": false, 00:16:43.532 "compare_and_write": false, 00:16:43.532 "abort": true, 00:16:43.532 "seek_hole": false, 00:16:43.532 "seek_data": false, 00:16:43.532 "copy": true, 00:16:43.532 "nvme_iov_md": false 00:16:43.532 }, 00:16:43.532 "memory_domains": [ 00:16:43.532 { 00:16:43.532 "dma_device_id": "system", 00:16:43.532 "dma_device_type": 1 00:16:43.532 }, 00:16:43.532 { 00:16:43.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.532 "dma_device_type": 2 00:16:43.532 } 00:16:43.532 ], 00:16:43.532 "driver_specific": {} 00:16:43.532 } 00:16:43.532 ] 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:43.532 BaseBdev3 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:43.532 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:43.533 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:43.533 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:43.533 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:43.792 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:43.792 [ 00:16:43.792 { 00:16:43.792 "name": "BaseBdev3", 00:16:43.792 "aliases": [ 00:16:43.792 "d51b6be8-9e56-4c8c-a265-c8068d84b50a" 00:16:43.792 ], 00:16:43.792 "product_name": "Malloc disk", 00:16:43.792 "block_size": 512, 00:16:43.792 "num_blocks": 65536, 00:16:43.792 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:43.792 "assigned_rate_limits": { 00:16:43.792 "rw_ios_per_sec": 0, 00:16:43.792 "rw_mbytes_per_sec": 0, 00:16:43.792 "r_mbytes_per_sec": 0, 00:16:43.792 "w_mbytes_per_sec": 0 00:16:43.792 }, 00:16:43.792 "claimed": false, 00:16:43.792 "zoned": false, 00:16:43.792 
"supported_io_types": { 00:16:43.792 "read": true, 00:16:43.792 "write": true, 00:16:43.792 "unmap": true, 00:16:43.792 "flush": true, 00:16:43.792 "reset": true, 00:16:43.792 "nvme_admin": false, 00:16:43.792 "nvme_io": false, 00:16:43.792 "nvme_io_md": false, 00:16:43.792 "write_zeroes": true, 00:16:43.792 "zcopy": true, 00:16:43.792 "get_zone_info": false, 00:16:43.792 "zone_management": false, 00:16:43.792 "zone_append": false, 00:16:43.792 "compare": false, 00:16:43.792 "compare_and_write": false, 00:16:43.792 "abort": true, 00:16:43.792 "seek_hole": false, 00:16:43.792 "seek_data": false, 00:16:43.792 "copy": true, 00:16:43.792 "nvme_iov_md": false 00:16:43.792 }, 00:16:43.792 "memory_domains": [ 00:16:43.792 { 00:16:43.792 "dma_device_id": "system", 00:16:43.792 "dma_device_type": 1 00:16:43.792 }, 00:16:43.792 { 00:16:43.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.792 "dma_device_type": 2 00:16:43.792 } 00:16:43.792 ], 00:16:43.792 "driver_specific": {} 00:16:43.792 } 00:16:43.792 ] 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:44.052 BaseBdev4 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:44.052 23:38:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:44.311 23:38:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:44.311 [ 00:16:44.311 { 00:16:44.311 "name": "BaseBdev4", 00:16:44.311 "aliases": [ 00:16:44.311 "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd" 00:16:44.311 ], 00:16:44.311 "product_name": "Malloc disk", 00:16:44.311 "block_size": 512, 00:16:44.311 "num_blocks": 65536, 00:16:44.311 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:44.311 "assigned_rate_limits": { 00:16:44.311 "rw_ios_per_sec": 0, 00:16:44.311 "rw_mbytes_per_sec": 0, 00:16:44.311 "r_mbytes_per_sec": 0, 00:16:44.311 "w_mbytes_per_sec": 0 00:16:44.311 }, 00:16:44.311 "claimed": false, 00:16:44.311 "zoned": false, 00:16:44.311 "supported_io_types": { 00:16:44.311 "read": true, 00:16:44.311 "write": true, 00:16:44.311 "unmap": true, 00:16:44.311 "flush": true, 00:16:44.311 "reset": true, 00:16:44.311 "nvme_admin": false, 00:16:44.311 "nvme_io": false, 00:16:44.311 "nvme_io_md": false, 00:16:44.311 "write_zeroes": true, 00:16:44.311 "zcopy": true, 00:16:44.311 "get_zone_info": false, 00:16:44.311 "zone_management": false, 00:16:44.311 "zone_append": false, 00:16:44.311 "compare": false, 00:16:44.311 "compare_and_write": false, 00:16:44.311 "abort": true, 00:16:44.311 "seek_hole": false, 00:16:44.311 "seek_data": false, 00:16:44.311 "copy": true, 00:16:44.311 "nvme_iov_md": false 00:16:44.311 }, 00:16:44.311 "memory_domains": 
[ 00:16:44.311 { 00:16:44.311 "dma_device_id": "system", 00:16:44.311 "dma_device_type": 1 00:16:44.311 }, 00:16:44.311 { 00:16:44.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.311 "dma_device_type": 2 00:16:44.311 } 00:16:44.311 ], 00:16:44.311 "driver_specific": {} 00:16:44.311 } 00:16:44.311 ] 00:16:44.311 23:38:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:44.311 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:44.311 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:44.311 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:44.570 [2024-07-24 23:38:29.448249] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:44.570 [2024-07-24 23:38:29.448278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:44.570 [2024-07-24 23:38:29.448289] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:44.570 [2024-07-24 23:38:29.449243] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:44.570 [2024-07-24 23:38:29.449271] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.570 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.829 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.829 "name": "Existed_Raid", 00:16:44.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.829 "strip_size_kb": 0, 00:16:44.829 "state": "configuring", 00:16:44.829 "raid_level": "raid1", 00:16:44.829 "superblock": false, 00:16:44.829 "num_base_bdevs": 4, 00:16:44.829 "num_base_bdevs_discovered": 3, 00:16:44.829 "num_base_bdevs_operational": 4, 00:16:44.829 "base_bdevs_list": [ 00:16:44.829 { 00:16:44.829 "name": "BaseBdev1", 00:16:44.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.829 "is_configured": false, 00:16:44.829 "data_offset": 0, 00:16:44.829 "data_size": 0 00:16:44.829 }, 00:16:44.829 { 00:16:44.829 "name": "BaseBdev2", 00:16:44.829 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:44.829 "is_configured": true, 00:16:44.829 "data_offset": 0, 00:16:44.829 "data_size": 65536 00:16:44.829 }, 00:16:44.829 { 00:16:44.829 "name": 
"BaseBdev3", 00:16:44.829 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:44.829 "is_configured": true, 00:16:44.829 "data_offset": 0, 00:16:44.829 "data_size": 65536 00:16:44.829 }, 00:16:44.829 { 00:16:44.829 "name": "BaseBdev4", 00:16:44.829 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:44.829 "is_configured": true, 00:16:44.829 "data_offset": 0, 00:16:44.829 "data_size": 65536 00:16:44.829 } 00:16:44.829 ] 00:16:44.829 }' 00:16:44.829 23:38:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.829 23:38:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:45.397 [2024-07-24 23:38:30.250307] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.397 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.656 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.656 "name": "Existed_Raid", 00:16:45.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.656 "strip_size_kb": 0, 00:16:45.656 "state": "configuring", 00:16:45.656 "raid_level": "raid1", 00:16:45.656 "superblock": false, 00:16:45.656 "num_base_bdevs": 4, 00:16:45.656 "num_base_bdevs_discovered": 2, 00:16:45.656 "num_base_bdevs_operational": 4, 00:16:45.656 "base_bdevs_list": [ 00:16:45.656 { 00:16:45.656 "name": "BaseBdev1", 00:16:45.656 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.656 "is_configured": false, 00:16:45.656 "data_offset": 0, 00:16:45.656 "data_size": 0 00:16:45.656 }, 00:16:45.656 { 00:16:45.656 "name": null, 00:16:45.656 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:45.656 "is_configured": false, 00:16:45.656 "data_offset": 0, 00:16:45.656 "data_size": 65536 00:16:45.656 }, 00:16:45.656 { 00:16:45.656 "name": "BaseBdev3", 00:16:45.656 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:45.656 "is_configured": true, 00:16:45.656 "data_offset": 0, 00:16:45.656 "data_size": 65536 00:16:45.656 }, 00:16:45.656 { 00:16:45.656 "name": "BaseBdev4", 00:16:45.656 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:45.656 "is_configured": true, 00:16:45.656 "data_offset": 0, 00:16:45.656 "data_size": 65536 00:16:45.656 } 00:16:45.656 ] 00:16:45.656 }' 00:16:45.656 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:16:45.656 23:38:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.223 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.223 23:38:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:46.223 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:46.223 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:46.482 [2024-07-24 23:38:31.247755] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:46.482 BaseBdev1 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:46.482 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:16:46.741 [ 00:16:46.741 { 00:16:46.741 "name": "BaseBdev1", 00:16:46.741 "aliases": [ 00:16:46.741 "1188e5d3-6772-4912-9276-19cac0f4929c" 00:16:46.741 ], 00:16:46.741 "product_name": "Malloc disk", 00:16:46.741 "block_size": 512, 00:16:46.741 "num_blocks": 65536, 00:16:46.741 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:46.741 "assigned_rate_limits": { 00:16:46.741 "rw_ios_per_sec": 0, 00:16:46.741 "rw_mbytes_per_sec": 0, 00:16:46.741 "r_mbytes_per_sec": 0, 00:16:46.741 "w_mbytes_per_sec": 0 00:16:46.741 }, 00:16:46.741 "claimed": true, 00:16:46.741 "claim_type": "exclusive_write", 00:16:46.741 "zoned": false, 00:16:46.741 "supported_io_types": { 00:16:46.741 "read": true, 00:16:46.741 "write": true, 00:16:46.741 "unmap": true, 00:16:46.741 "flush": true, 00:16:46.741 "reset": true, 00:16:46.741 "nvme_admin": false, 00:16:46.741 "nvme_io": false, 00:16:46.741 "nvme_io_md": false, 00:16:46.741 "write_zeroes": true, 00:16:46.741 "zcopy": true, 00:16:46.741 "get_zone_info": false, 00:16:46.741 "zone_management": false, 00:16:46.741 "zone_append": false, 00:16:46.741 "compare": false, 00:16:46.741 "compare_and_write": false, 00:16:46.741 "abort": true, 00:16:46.741 "seek_hole": false, 00:16:46.741 "seek_data": false, 00:16:46.741 "copy": true, 00:16:46.741 "nvme_iov_md": false 00:16:46.741 }, 00:16:46.741 "memory_domains": [ 00:16:46.741 { 00:16:46.741 "dma_device_id": "system", 00:16:46.741 "dma_device_type": 1 00:16:46.741 }, 00:16:46.741 { 00:16:46.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:46.741 "dma_device_type": 2 00:16:46.741 } 00:16:46.741 ], 00:16:46.741 "driver_specific": {} 00:16:46.741 } 00:16:46.741 ] 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=Existed_Raid 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.741 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.000 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.000 "name": "Existed_Raid", 00:16:47.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.000 "strip_size_kb": 0, 00:16:47.000 "state": "configuring", 00:16:47.000 "raid_level": "raid1", 00:16:47.000 "superblock": false, 00:16:47.000 "num_base_bdevs": 4, 00:16:47.000 "num_base_bdevs_discovered": 3, 00:16:47.000 "num_base_bdevs_operational": 4, 00:16:47.000 "base_bdevs_list": [ 00:16:47.000 { 00:16:47.000 "name": "BaseBdev1", 00:16:47.000 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:47.000 "is_configured": true, 00:16:47.000 "data_offset": 0, 00:16:47.000 "data_size": 65536 00:16:47.000 }, 00:16:47.000 
{ 00:16:47.000 "name": null, 00:16:47.000 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:47.000 "is_configured": false, 00:16:47.000 "data_offset": 0, 00:16:47.000 "data_size": 65536 00:16:47.000 }, 00:16:47.000 { 00:16:47.000 "name": "BaseBdev3", 00:16:47.000 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:47.000 "is_configured": true, 00:16:47.000 "data_offset": 0, 00:16:47.000 "data_size": 65536 00:16:47.000 }, 00:16:47.000 { 00:16:47.000 "name": "BaseBdev4", 00:16:47.000 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:47.000 "is_configured": true, 00:16:47.000 "data_offset": 0, 00:16:47.000 "data_size": 65536 00:16:47.000 } 00:16:47.000 ] 00:16:47.000 }' 00:16:47.000 23:38:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.000 23:38:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.259 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.518 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:47.518 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:47.518 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:47.777 [2024-07-24 23:38:32.575205] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.777 "name": "Existed_Raid", 00:16:47.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.777 "strip_size_kb": 0, 00:16:47.777 "state": "configuring", 00:16:47.777 "raid_level": "raid1", 00:16:47.777 "superblock": false, 00:16:47.777 "num_base_bdevs": 4, 00:16:47.777 "num_base_bdevs_discovered": 2, 00:16:47.777 "num_base_bdevs_operational": 4, 00:16:47.777 "base_bdevs_list": [ 00:16:47.777 { 00:16:47.777 "name": "BaseBdev1", 00:16:47.777 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:47.777 "is_configured": true, 00:16:47.777 "data_offset": 0, 00:16:47.777 "data_size": 65536 00:16:47.777 }, 00:16:47.777 { 00:16:47.777 "name": null, 00:16:47.777 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:47.777 
"is_configured": false, 00:16:47.777 "data_offset": 0, 00:16:47.777 "data_size": 65536 00:16:47.777 }, 00:16:47.777 { 00:16:47.777 "name": null, 00:16:47.777 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:47.777 "is_configured": false, 00:16:47.777 "data_offset": 0, 00:16:47.777 "data_size": 65536 00:16:47.777 }, 00:16:47.777 { 00:16:47.777 "name": "BaseBdev4", 00:16:47.777 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:47.777 "is_configured": true, 00:16:47.777 "data_offset": 0, 00:16:47.777 "data_size": 65536 00:16:47.777 } 00:16:47.777 ] 00:16:47.777 }' 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.777 23:38:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.345 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.345 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:48.605 [2024-07-24 23:38:33.545732] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.605 23:38:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.605 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.864 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.864 "name": "Existed_Raid", 00:16:48.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.864 "strip_size_kb": 0, 00:16:48.864 "state": "configuring", 00:16:48.864 "raid_level": "raid1", 00:16:48.864 "superblock": false, 00:16:48.864 "num_base_bdevs": 4, 00:16:48.864 "num_base_bdevs_discovered": 3, 00:16:48.864 "num_base_bdevs_operational": 4, 00:16:48.864 "base_bdevs_list": [ 00:16:48.864 { 00:16:48.864 "name": "BaseBdev1", 00:16:48.864 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:48.864 "is_configured": true, 00:16:48.864 "data_offset": 0, 00:16:48.864 "data_size": 65536 00:16:48.864 }, 00:16:48.864 { 00:16:48.864 "name": null, 00:16:48.864 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:48.864 "is_configured": false, 00:16:48.864 "data_offset": 0, 00:16:48.864 
"data_size": 65536 00:16:48.864 }, 00:16:48.864 { 00:16:48.864 "name": "BaseBdev3", 00:16:48.864 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:48.864 "is_configured": true, 00:16:48.864 "data_offset": 0, 00:16:48.864 "data_size": 65536 00:16:48.864 }, 00:16:48.864 { 00:16:48.864 "name": "BaseBdev4", 00:16:48.864 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:48.864 "is_configured": true, 00:16:48.864 "data_offset": 0, 00:16:48.864 "data_size": 65536 00:16:48.864 } 00:16:48.864 ] 00:16:48.864 }' 00:16:48.864 23:38:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.864 23:38:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.431 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.431 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:49.431 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:49.431 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.690 [2024-07-24 23:38:34.560363] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.690 23:38:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.690 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.949 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.949 "name": "Existed_Raid", 00:16:49.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.949 "strip_size_kb": 0, 00:16:49.949 "state": "configuring", 00:16:49.949 "raid_level": "raid1", 00:16:49.949 "superblock": false, 00:16:49.949 "num_base_bdevs": 4, 00:16:49.949 "num_base_bdevs_discovered": 2, 00:16:49.949 "num_base_bdevs_operational": 4, 00:16:49.949 "base_bdevs_list": [ 00:16:49.949 { 00:16:49.949 "name": null, 00:16:49.949 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:49.949 "is_configured": false, 00:16:49.949 "data_offset": 0, 00:16:49.949 "data_size": 65536 00:16:49.949 }, 00:16:49.949 { 00:16:49.949 "name": null, 00:16:49.949 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:49.949 "is_configured": false, 00:16:49.949 "data_offset": 0, 00:16:49.949 "data_size": 65536 00:16:49.949 }, 00:16:49.949 { 00:16:49.949 "name": "BaseBdev3", 00:16:49.949 "uuid": 
"d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:49.949 "is_configured": true, 00:16:49.949 "data_offset": 0, 00:16:49.949 "data_size": 65536 00:16:49.949 }, 00:16:49.949 { 00:16:49.949 "name": "BaseBdev4", 00:16:49.949 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:49.949 "is_configured": true, 00:16:49.949 "data_offset": 0, 00:16:49.949 "data_size": 65536 00:16:49.949 } 00:16:49.949 ] 00:16:49.949 }' 00:16:49.949 23:38:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.949 23:38:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.517 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.517 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:50.517 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:50.517 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:50.775 [2024-07-24 23:38:35.544175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.775 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.775 "name": "Existed_Raid", 00:16:50.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.775 "strip_size_kb": 0, 00:16:50.775 "state": "configuring", 00:16:50.775 "raid_level": "raid1", 00:16:50.775 "superblock": false, 00:16:50.776 "num_base_bdevs": 4, 00:16:50.776 "num_base_bdevs_discovered": 3, 00:16:50.776 "num_base_bdevs_operational": 4, 00:16:50.776 "base_bdevs_list": [ 00:16:50.776 { 00:16:50.776 "name": null, 00:16:50.776 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:50.776 "is_configured": false, 00:16:50.776 "data_offset": 0, 00:16:50.776 "data_size": 65536 00:16:50.776 }, 00:16:50.776 { 00:16:50.776 "name": "BaseBdev2", 00:16:50.776 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:50.776 "is_configured": true, 00:16:50.776 "data_offset": 0, 00:16:50.776 "data_size": 65536 00:16:50.776 }, 00:16:50.776 { 00:16:50.776 "name": "BaseBdev3", 00:16:50.776 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:50.776 "is_configured": true, 
00:16:50.776 "data_offset": 0, 00:16:50.776 "data_size": 65536 00:16:50.776 }, 00:16:50.776 { 00:16:50.776 "name": "BaseBdev4", 00:16:50.776 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:50.776 "is_configured": true, 00:16:50.776 "data_offset": 0, 00:16:50.776 "data_size": 65536 00:16:50.776 } 00:16:50.776 ] 00:16:50.776 }' 00:16:50.776 23:38:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.776 23:38:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.344 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.344 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.602 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:51.602 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.602 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:51.602 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1188e5d3-6772-4912-9276-19cac0f4929c 00:16:51.861 [2024-07-24 23:38:36.721791] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:51.861 [2024-07-24 23:38:36.721821] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1811250 00:16:51.861 [2024-07-24 23:38:36.721825] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:51.861 [2024-07-24 23:38:36.721953] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x180e580 00:16:51.861 [2024-07-24 23:38:36.722037] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1811250 00:16:51.861 [2024-07-24 23:38:36.722042] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1811250 00:16:51.861 [2024-07-24 23:38:36.722171] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:51.861 NewBaseBdev 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:51.861 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.120 23:38:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:52.120 [ 00:16:52.120 { 00:16:52.120 "name": "NewBaseBdev", 00:16:52.120 "aliases": [ 00:16:52.120 "1188e5d3-6772-4912-9276-19cac0f4929c" 00:16:52.120 ], 00:16:52.120 "product_name": "Malloc disk", 00:16:52.120 "block_size": 512, 00:16:52.120 "num_blocks": 65536, 00:16:52.120 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:52.120 "assigned_rate_limits": { 00:16:52.120 "rw_ios_per_sec": 0, 
00:16:52.120 "rw_mbytes_per_sec": 0, 00:16:52.120 "r_mbytes_per_sec": 0, 00:16:52.120 "w_mbytes_per_sec": 0 00:16:52.120 }, 00:16:52.120 "claimed": true, 00:16:52.120 "claim_type": "exclusive_write", 00:16:52.120 "zoned": false, 00:16:52.120 "supported_io_types": { 00:16:52.120 "read": true, 00:16:52.120 "write": true, 00:16:52.120 "unmap": true, 00:16:52.120 "flush": true, 00:16:52.120 "reset": true, 00:16:52.120 "nvme_admin": false, 00:16:52.120 "nvme_io": false, 00:16:52.120 "nvme_io_md": false, 00:16:52.120 "write_zeroes": true, 00:16:52.120 "zcopy": true, 00:16:52.120 "get_zone_info": false, 00:16:52.120 "zone_management": false, 00:16:52.120 "zone_append": false, 00:16:52.120 "compare": false, 00:16:52.120 "compare_and_write": false, 00:16:52.120 "abort": true, 00:16:52.120 "seek_hole": false, 00:16:52.120 "seek_data": false, 00:16:52.120 "copy": true, 00:16:52.120 "nvme_iov_md": false 00:16:52.120 }, 00:16:52.120 "memory_domains": [ 00:16:52.120 { 00:16:52.120 "dma_device_id": "system", 00:16:52.120 "dma_device_type": 1 00:16:52.120 }, 00:16:52.120 { 00:16:52.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.120 "dma_device_type": 2 00:16:52.120 } 00:16:52.120 ], 00:16:52.120 "driver_specific": {} 00:16:52.120 } 00:16:52.120 ] 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.120 23:38:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.120 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.121 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.121 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.121 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.121 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.379 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.379 "name": "Existed_Raid", 00:16:52.379 "uuid": "e79a1bd6-9412-420d-b447-bfb3366d2a57", 00:16:52.379 "strip_size_kb": 0, 00:16:52.380 "state": "online", 00:16:52.380 "raid_level": "raid1", 00:16:52.380 "superblock": false, 00:16:52.380 "num_base_bdevs": 4, 00:16:52.380 "num_base_bdevs_discovered": 4, 00:16:52.380 "num_base_bdevs_operational": 4, 00:16:52.380 "base_bdevs_list": [ 00:16:52.380 { 00:16:52.380 "name": "NewBaseBdev", 00:16:52.380 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:52.380 "is_configured": true, 00:16:52.380 "data_offset": 0, 00:16:52.380 "data_size": 65536 00:16:52.380 }, 00:16:52.380 { 00:16:52.380 "name": "BaseBdev2", 00:16:52.380 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:52.380 "is_configured": true, 00:16:52.380 "data_offset": 0, 00:16:52.380 "data_size": 65536 00:16:52.380 }, 00:16:52.380 { 00:16:52.380 "name": "BaseBdev3", 00:16:52.380 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:52.380 "is_configured": true, 00:16:52.380 "data_offset": 0, 
00:16:52.380 "data_size": 65536 00:16:52.380 }, 00:16:52.380 { 00:16:52.380 "name": "BaseBdev4", 00:16:52.380 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:52.380 "is_configured": true, 00:16:52.380 "data_offset": 0, 00:16:52.380 "data_size": 65536 00:16:52.380 } 00:16:52.380 ] 00:16:52.380 }' 00:16:52.380 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.380 23:38:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:52.947 [2024-07-24 23:38:37.885017] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:52.947 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:52.947 "name": "Existed_Raid", 00:16:52.947 "aliases": [ 00:16:52.947 "e79a1bd6-9412-420d-b447-bfb3366d2a57" 00:16:52.947 ], 00:16:52.947 "product_name": "Raid Volume", 00:16:52.947 "block_size": 512, 00:16:52.947 "num_blocks": 65536, 00:16:52.947 "uuid": 
"e79a1bd6-9412-420d-b447-bfb3366d2a57", 00:16:52.947 "assigned_rate_limits": { 00:16:52.947 "rw_ios_per_sec": 0, 00:16:52.947 "rw_mbytes_per_sec": 0, 00:16:52.947 "r_mbytes_per_sec": 0, 00:16:52.947 "w_mbytes_per_sec": 0 00:16:52.947 }, 00:16:52.947 "claimed": false, 00:16:52.947 "zoned": false, 00:16:52.947 "supported_io_types": { 00:16:52.947 "read": true, 00:16:52.947 "write": true, 00:16:52.947 "unmap": false, 00:16:52.947 "flush": false, 00:16:52.947 "reset": true, 00:16:52.947 "nvme_admin": false, 00:16:52.947 "nvme_io": false, 00:16:52.947 "nvme_io_md": false, 00:16:52.947 "write_zeroes": true, 00:16:52.947 "zcopy": false, 00:16:52.947 "get_zone_info": false, 00:16:52.947 "zone_management": false, 00:16:52.947 "zone_append": false, 00:16:52.947 "compare": false, 00:16:52.947 "compare_and_write": false, 00:16:52.947 "abort": false, 00:16:52.947 "seek_hole": false, 00:16:52.947 "seek_data": false, 00:16:52.947 "copy": false, 00:16:52.947 "nvme_iov_md": false 00:16:52.947 }, 00:16:52.947 "memory_domains": [ 00:16:52.947 { 00:16:52.947 "dma_device_id": "system", 00:16:52.947 "dma_device_type": 1 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.947 "dma_device_type": 2 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "system", 00:16:52.947 "dma_device_type": 1 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.947 "dma_device_type": 2 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "system", 00:16:52.947 "dma_device_type": 1 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.947 "dma_device_type": 2 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "system", 00:16:52.947 "dma_device_type": 1 00:16:52.947 }, 00:16:52.947 { 00:16:52.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.947 "dma_device_type": 2 00:16:52.947 } 00:16:52.947 ], 00:16:52.947 "driver_specific": { 00:16:52.947 "raid": { 
00:16:52.947 "uuid": "e79a1bd6-9412-420d-b447-bfb3366d2a57", 00:16:52.947 "strip_size_kb": 0, 00:16:52.947 "state": "online", 00:16:52.947 "raid_level": "raid1", 00:16:52.947 "superblock": false, 00:16:52.948 "num_base_bdevs": 4, 00:16:52.948 "num_base_bdevs_discovered": 4, 00:16:52.948 "num_base_bdevs_operational": 4, 00:16:52.948 "base_bdevs_list": [ 00:16:52.948 { 00:16:52.948 "name": "NewBaseBdev", 00:16:52.948 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:52.948 "is_configured": true, 00:16:52.948 "data_offset": 0, 00:16:52.948 "data_size": 65536 00:16:52.948 }, 00:16:52.948 { 00:16:52.948 "name": "BaseBdev2", 00:16:52.948 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:52.948 "is_configured": true, 00:16:52.948 "data_offset": 0, 00:16:52.948 "data_size": 65536 00:16:52.948 }, 00:16:52.948 { 00:16:52.948 "name": "BaseBdev3", 00:16:52.948 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:52.948 "is_configured": true, 00:16:52.948 "data_offset": 0, 00:16:52.948 "data_size": 65536 00:16:52.948 }, 00:16:52.948 { 00:16:52.948 "name": "BaseBdev4", 00:16:52.948 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:52.948 "is_configured": true, 00:16:52.948 "data_offset": 0, 00:16:52.948 "data_size": 65536 00:16:52.948 } 00:16:52.948 ] 00:16:52.948 } 00:16:52.948 } 00:16:52.948 }' 00:16:52.948 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:52.948 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:52.948 BaseBdev2 00:16:52.948 BaseBdev3 00:16:52.948 BaseBdev4' 00:16:52.948 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.948 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 
00:16:52.948 23:38:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.207 "name": "NewBaseBdev", 00:16:53.207 "aliases": [ 00:16:53.207 "1188e5d3-6772-4912-9276-19cac0f4929c" 00:16:53.207 ], 00:16:53.207 "product_name": "Malloc disk", 00:16:53.207 "block_size": 512, 00:16:53.207 "num_blocks": 65536, 00:16:53.207 "uuid": "1188e5d3-6772-4912-9276-19cac0f4929c", 00:16:53.207 "assigned_rate_limits": { 00:16:53.207 "rw_ios_per_sec": 0, 00:16:53.207 "rw_mbytes_per_sec": 0, 00:16:53.207 "r_mbytes_per_sec": 0, 00:16:53.207 "w_mbytes_per_sec": 0 00:16:53.207 }, 00:16:53.207 "claimed": true, 00:16:53.207 "claim_type": "exclusive_write", 00:16:53.207 "zoned": false, 00:16:53.207 "supported_io_types": { 00:16:53.207 "read": true, 00:16:53.207 "write": true, 00:16:53.207 "unmap": true, 00:16:53.207 "flush": true, 00:16:53.207 "reset": true, 00:16:53.207 "nvme_admin": false, 00:16:53.207 "nvme_io": false, 00:16:53.207 "nvme_io_md": false, 00:16:53.207 "write_zeroes": true, 00:16:53.207 "zcopy": true, 00:16:53.207 "get_zone_info": false, 00:16:53.207 "zone_management": false, 00:16:53.207 "zone_append": false, 00:16:53.207 "compare": false, 00:16:53.207 "compare_and_write": false, 00:16:53.207 "abort": true, 00:16:53.207 "seek_hole": false, 00:16:53.207 "seek_data": false, 00:16:53.207 "copy": true, 00:16:53.207 "nvme_iov_md": false 00:16:53.207 }, 00:16:53.207 "memory_domains": [ 00:16:53.207 { 00:16:53.207 "dma_device_id": "system", 00:16:53.207 "dma_device_type": 1 00:16:53.207 }, 00:16:53.207 { 00:16:53.207 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.207 "dma_device_type": 2 00:16:53.207 } 00:16:53.207 ], 00:16:53.207 "driver_specific": {} 00:16:53.207 }' 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.207 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.466 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.725 "name": "BaseBdev2", 00:16:53.725 "aliases": [ 00:16:53.725 "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5" 00:16:53.725 ], 00:16:53.725 "product_name": "Malloc disk", 00:16:53.725 "block_size": 512, 00:16:53.725 "num_blocks": 65536, 00:16:53.725 "uuid": "ad8b39ec-9a0b-4699-8c1d-00d471acfcb5", 00:16:53.725 "assigned_rate_limits": { 00:16:53.725 "rw_ios_per_sec": 0, 00:16:53.725 "rw_mbytes_per_sec": 
0, 00:16:53.725 "r_mbytes_per_sec": 0, 00:16:53.725 "w_mbytes_per_sec": 0 00:16:53.725 }, 00:16:53.725 "claimed": true, 00:16:53.725 "claim_type": "exclusive_write", 00:16:53.725 "zoned": false, 00:16:53.725 "supported_io_types": { 00:16:53.725 "read": true, 00:16:53.725 "write": true, 00:16:53.725 "unmap": true, 00:16:53.725 "flush": true, 00:16:53.725 "reset": true, 00:16:53.725 "nvme_admin": false, 00:16:53.725 "nvme_io": false, 00:16:53.725 "nvme_io_md": false, 00:16:53.725 "write_zeroes": true, 00:16:53.725 "zcopy": true, 00:16:53.725 "get_zone_info": false, 00:16:53.725 "zone_management": false, 00:16:53.725 "zone_append": false, 00:16:53.725 "compare": false, 00:16:53.725 "compare_and_write": false, 00:16:53.725 "abort": true, 00:16:53.725 "seek_hole": false, 00:16:53.725 "seek_data": false, 00:16:53.725 "copy": true, 00:16:53.725 "nvme_iov_md": false 00:16:53.725 }, 00:16:53.725 "memory_domains": [ 00:16:53.725 { 00:16:53.725 "dma_device_id": "system", 00:16:53.725 "dma_device_type": 1 00:16:53.725 }, 00:16:53.725 { 00:16:53.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.725 "dma_device_type": 2 00:16:53.725 } 00:16:53.725 ], 00:16:53.725 "driver_specific": {} 00:16:53.725 }' 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.725 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.984 "name": "BaseBdev3", 00:16:53.984 "aliases": [ 00:16:53.984 "d51b6be8-9e56-4c8c-a265-c8068d84b50a" 00:16:53.984 ], 00:16:53.984 "product_name": "Malloc disk", 00:16:53.984 "block_size": 512, 00:16:53.984 "num_blocks": 65536, 00:16:53.984 "uuid": "d51b6be8-9e56-4c8c-a265-c8068d84b50a", 00:16:53.984 "assigned_rate_limits": { 00:16:53.984 "rw_ios_per_sec": 0, 00:16:53.984 "rw_mbytes_per_sec": 0, 00:16:53.984 "r_mbytes_per_sec": 0, 00:16:53.984 "w_mbytes_per_sec": 0 00:16:53.984 }, 00:16:53.984 "claimed": true, 00:16:53.984 "claim_type": "exclusive_write", 00:16:53.984 "zoned": false, 00:16:53.984 "supported_io_types": { 00:16:53.984 "read": true, 00:16:53.984 "write": true, 00:16:53.984 "unmap": true, 00:16:53.984 "flush": true, 00:16:53.984 "reset": true, 00:16:53.984 "nvme_admin": false, 00:16:53.984 "nvme_io": false, 00:16:53.984 "nvme_io_md": false, 00:16:53.984 "write_zeroes": true, 00:16:53.984 "zcopy": true, 00:16:53.984 "get_zone_info": false, 00:16:53.984 "zone_management": false, 
00:16:53.984 "zone_append": false, 00:16:53.984 "compare": false, 00:16:53.984 "compare_and_write": false, 00:16:53.984 "abort": true, 00:16:53.984 "seek_hole": false, 00:16:53.984 "seek_data": false, 00:16:53.984 "copy": true, 00:16:53.984 "nvme_iov_md": false 00:16:53.984 }, 00:16:53.984 "memory_domains": [ 00:16:53.984 { 00:16:53.984 "dma_device_id": "system", 00:16:53.984 "dma_device_type": 1 00:16:53.984 }, 00:16:53.984 { 00:16:53.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.984 "dma_device_type": 2 00:16:53.984 } 00:16:53.984 ], 00:16:53.984 "driver_specific": {} 00:16:53.984 }' 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.984 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.243 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.243 23:38:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.243 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.243 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.243 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.244 23:38:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:54.244 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.502 "name": "BaseBdev4", 00:16:54.502 "aliases": [ 00:16:54.502 "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd" 00:16:54.502 ], 00:16:54.502 "product_name": "Malloc disk", 00:16:54.502 "block_size": 512, 00:16:54.502 "num_blocks": 65536, 00:16:54.502 "uuid": "2b5aa4f3-a2c9-41e0-baad-96d0e69755fd", 00:16:54.502 "assigned_rate_limits": { 00:16:54.502 "rw_ios_per_sec": 0, 00:16:54.502 "rw_mbytes_per_sec": 0, 00:16:54.502 "r_mbytes_per_sec": 0, 00:16:54.502 "w_mbytes_per_sec": 0 00:16:54.502 }, 00:16:54.502 "claimed": true, 00:16:54.502 "claim_type": "exclusive_write", 00:16:54.502 "zoned": false, 00:16:54.502 "supported_io_types": { 00:16:54.502 "read": true, 00:16:54.502 "write": true, 00:16:54.502 "unmap": true, 00:16:54.502 "flush": true, 00:16:54.502 "reset": true, 00:16:54.502 "nvme_admin": false, 00:16:54.502 "nvme_io": false, 00:16:54.502 "nvme_io_md": false, 00:16:54.502 "write_zeroes": true, 00:16:54.502 "zcopy": true, 00:16:54.502 "get_zone_info": false, 00:16:54.502 "zone_management": false, 00:16:54.502 "zone_append": false, 00:16:54.502 "compare": false, 00:16:54.502 "compare_and_write": false, 00:16:54.502 "abort": true, 00:16:54.502 "seek_hole": false, 00:16:54.502 "seek_data": false, 00:16:54.502 "copy": true, 00:16:54.502 "nvme_iov_md": false 00:16:54.502 }, 00:16:54.502 "memory_domains": [ 00:16:54.502 { 00:16:54.502 "dma_device_id": "system", 00:16:54.502 "dma_device_type": 1 00:16:54.502 }, 00:16:54.502 { 00:16:54.502 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.502 "dma_device_type": 2 00:16:54.502 } 00:16:54.502 ], 00:16:54.502 "driver_specific": {} 00:16:54.502 
}' 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.502 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.762 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.021 [2024-07-24 23:38:39.785736] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.021 [2024-07-24 23:38:39.785754] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.021 [2024-07-24 23:38:39.785789] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.021 [2024-07-24 23:38:39.785974] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.021 [2024-07-24 
23:38:39.785981] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1811250 name Existed_Raid, state offline 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 330377 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 330377 ']' 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 330377 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 330377 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 330377' 00:16:55.021 killing process with pid 330377 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 330377 00:16:55.021 [2024-07-24 23:38:39.844073] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:55.021 23:38:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 330377 00:16:55.021 [2024-07-24 23:38:39.875178] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.314 00:16:55.314 real 0m23.909s 00:16:55.314 user 0m44.616s 00:16:55.314 sys 0m3.666s 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:16:55.314 ************************************ 00:16:55.314 END TEST raid_state_function_test 00:16:55.314 ************************************ 00:16:55.314 23:38:40 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:16:55.314 23:38:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:55.314 23:38:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:55.314 23:38:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:55.314 ************************************ 00:16:55.314 START TEST raid_state_function_test_sb 00:16:55.314 ************************************ 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 true 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:55.314 23:38:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # 
superblock_create_arg=-s 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=335033 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 335033' 00:16:55.314 Process raid pid: 335033 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 335033 /var/tmp/spdk-raid.sock 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 335033 ']' 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:55.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:55.314 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.314 [2024-07-24 23:38:40.162050] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:16:55.314 [2024-07-24 23:38:40.162089] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:55.314 [2024-07-24 23:38:40.226897] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:55.593 [2024-07-24 23:38:40.305745] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:55.593 [2024-07-24 23:38:40.355930] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:55.593 [2024-07-24 23:38:40.355954] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.161 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:56.161 23:38:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:16:56.161 23:38:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:56.161 [2024-07-24 23:38:41.090649] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:56.161 [2024-07-24 23:38:41.090679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:56.161 [2024-07-24 23:38:41.090687] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:56.161 [2024-07-24 23:38:41.090692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:56.161 [2024-07-24 23:38:41.090697] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:56.161 [2024-07-24 23:38:41.090702] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:16:56.161 [2024-07-24 23:38:41.090706] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:56.161 [2024-07-24 23:38:41.090711] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.161 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.420 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.420 "name": "Existed_Raid", 00:16:56.420 "uuid": "e710a0f9-42a7-4787-8dcf-c5c95e29e3fc", 
00:16:56.420 "strip_size_kb": 0, 00:16:56.420 "state": "configuring", 00:16:56.420 "raid_level": "raid1", 00:16:56.420 "superblock": true, 00:16:56.420 "num_base_bdevs": 4, 00:16:56.420 "num_base_bdevs_discovered": 0, 00:16:56.420 "num_base_bdevs_operational": 4, 00:16:56.420 "base_bdevs_list": [ 00:16:56.420 { 00:16:56.420 "name": "BaseBdev1", 00:16:56.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.420 "is_configured": false, 00:16:56.420 "data_offset": 0, 00:16:56.420 "data_size": 0 00:16:56.420 }, 00:16:56.420 { 00:16:56.420 "name": "BaseBdev2", 00:16:56.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.420 "is_configured": false, 00:16:56.420 "data_offset": 0, 00:16:56.420 "data_size": 0 00:16:56.420 }, 00:16:56.420 { 00:16:56.420 "name": "BaseBdev3", 00:16:56.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.420 "is_configured": false, 00:16:56.420 "data_offset": 0, 00:16:56.420 "data_size": 0 00:16:56.420 }, 00:16:56.420 { 00:16:56.420 "name": "BaseBdev4", 00:16:56.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.420 "is_configured": false, 00:16:56.420 "data_offset": 0, 00:16:56.420 "data_size": 0 00:16:56.420 } 00:16:56.420 ] 00:16:56.420 }' 00:16:56.420 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.420 23:38:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:56.988 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:56.988 [2024-07-24 23:38:41.912687] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:56.988 [2024-07-24 23:38:41.912707] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1041b50 name Existed_Raid, state configuring 00:16:56.988 23:38:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:57.246 [2024-07-24 23:38:42.085145] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:57.246 [2024-07-24 23:38:42.085163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:57.246 [2024-07-24 23:38:42.085168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:57.246 [2024-07-24 23:38:42.085173] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:57.246 [2024-07-24 23:38:42.085176] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:57.246 [2024-07-24 23:38:42.085181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:57.246 [2024-07-24 23:38:42.085201] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:57.246 [2024-07-24 23:38:42.085206] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:57.246 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:57.505 [2024-07-24 23:38:42.257740] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:57.505 BaseBdev1 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:57.505 23:38:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:57.505 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:57.764 [ 00:16:57.764 { 00:16:57.764 "name": "BaseBdev1", 00:16:57.764 "aliases": [ 00:16:57.764 "8a0eaded-3367-426a-9b3e-e3a1c45270c4" 00:16:57.764 ], 00:16:57.764 "product_name": "Malloc disk", 00:16:57.764 "block_size": 512, 00:16:57.764 "num_blocks": 65536, 00:16:57.764 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:16:57.764 "assigned_rate_limits": { 00:16:57.764 "rw_ios_per_sec": 0, 00:16:57.764 "rw_mbytes_per_sec": 0, 00:16:57.764 "r_mbytes_per_sec": 0, 00:16:57.764 "w_mbytes_per_sec": 0 00:16:57.764 }, 00:16:57.764 "claimed": true, 00:16:57.764 "claim_type": "exclusive_write", 00:16:57.764 "zoned": false, 00:16:57.764 "supported_io_types": { 00:16:57.764 "read": true, 00:16:57.764 "write": true, 00:16:57.764 "unmap": true, 00:16:57.764 "flush": true, 00:16:57.764 "reset": true, 00:16:57.764 "nvme_admin": false, 00:16:57.764 "nvme_io": false, 00:16:57.764 "nvme_io_md": false, 00:16:57.764 "write_zeroes": true, 00:16:57.764 "zcopy": true, 00:16:57.764 "get_zone_info": false, 00:16:57.764 "zone_management": false, 00:16:57.764 "zone_append": false, 00:16:57.764 "compare": false, 00:16:57.764 "compare_and_write": false, 00:16:57.764 "abort": true, 00:16:57.764 "seek_hole": false, 00:16:57.764 "seek_data": false, 
00:16:57.764 "copy": true, 00:16:57.764 "nvme_iov_md": false 00:16:57.764 }, 00:16:57.764 "memory_domains": [ 00:16:57.764 { 00:16:57.764 "dma_device_id": "system", 00:16:57.764 "dma_device_type": 1 00:16:57.764 }, 00:16:57.764 { 00:16:57.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.764 "dma_device_type": 2 00:16:57.764 } 00:16:57.764 ], 00:16:57.764 "driver_specific": {} 00:16:57.764 } 00:16:57.764 ] 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.764 "name": "Existed_Raid", 00:16:57.764 "uuid": "53dac64d-964e-4159-bc76-8119f9e07a82", 00:16:57.764 "strip_size_kb": 0, 00:16:57.764 "state": "configuring", 00:16:57.764 "raid_level": "raid1", 00:16:57.764 "superblock": true, 00:16:57.764 "num_base_bdevs": 4, 00:16:57.764 "num_base_bdevs_discovered": 1, 00:16:57.764 "num_base_bdevs_operational": 4, 00:16:57.764 "base_bdevs_list": [ 00:16:57.764 { 00:16:57.764 "name": "BaseBdev1", 00:16:57.764 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:16:57.764 "is_configured": true, 00:16:57.764 "data_offset": 2048, 00:16:57.764 "data_size": 63488 00:16:57.764 }, 00:16:57.764 { 00:16:57.764 "name": "BaseBdev2", 00:16:57.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.764 "is_configured": false, 00:16:57.764 "data_offset": 0, 00:16:57.764 "data_size": 0 00:16:57.764 }, 00:16:57.764 { 00:16:57.764 "name": "BaseBdev3", 00:16:57.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.764 "is_configured": false, 00:16:57.764 "data_offset": 0, 00:16:57.764 "data_size": 0 00:16:57.764 }, 00:16:57.764 { 00:16:57.764 "name": "BaseBdev4", 00:16:57.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.764 "is_configured": false, 00:16:57.764 "data_offset": 0, 00:16:57.764 "data_size": 0 00:16:57.764 } 00:16:57.764 ] 00:16:57.764 }' 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.764 23:38:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:58.332 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:58.589 [2024-07-24 23:38:43.408695] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:16:58.589 [2024-07-24 23:38:43.408725] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10413a0 name Existed_Raid, state configuring 00:16:58.589 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:58.589 [2024-07-24 23:38:43.573142] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:58.589 [2024-07-24 23:38:43.574103] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:58.589 [2024-07-24 23:38:43.574129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:58.589 [2024-07-24 23:38:43.574134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:58.589 [2024-07-24 23:38:43.574139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:58.589 [2024-07-24 23:38:43.574143] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:58.589 [2024-07-24 23:38:43.574148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.848 23:38:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.848 "name": "Existed_Raid", 00:16:58.848 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:16:58.848 "strip_size_kb": 0, 00:16:58.848 "state": "configuring", 00:16:58.848 "raid_level": "raid1", 00:16:58.848 "superblock": true, 00:16:58.848 "num_base_bdevs": 4, 00:16:58.848 "num_base_bdevs_discovered": 1, 00:16:58.848 "num_base_bdevs_operational": 4, 00:16:58.848 "base_bdevs_list": [ 00:16:58.848 { 00:16:58.848 "name": "BaseBdev1", 00:16:58.848 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:16:58.848 "is_configured": true, 00:16:58.848 "data_offset": 2048, 00:16:58.848 "data_size": 63488 00:16:58.848 }, 00:16:58.848 { 00:16:58.848 "name": "BaseBdev2", 00:16:58.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.848 "is_configured": false, 
00:16:58.848 "data_offset": 0, 00:16:58.848 "data_size": 0 00:16:58.848 }, 00:16:58.848 { 00:16:58.848 "name": "BaseBdev3", 00:16:58.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.848 "is_configured": false, 00:16:58.848 "data_offset": 0, 00:16:58.848 "data_size": 0 00:16:58.848 }, 00:16:58.848 { 00:16:58.848 "name": "BaseBdev4", 00:16:58.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.848 "is_configured": false, 00:16:58.848 "data_offset": 0, 00:16:58.848 "data_size": 0 00:16:58.848 } 00:16:58.848 ] 00:16:58.848 }' 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.848 23:38:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:59.415 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:59.673 [2024-07-24 23:38:44.421933] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:59.673 BaseBdev2 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:16:59.674 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:59.932 [ 00:16:59.932 { 00:16:59.932 "name": "BaseBdev2", 00:16:59.932 "aliases": [ 00:16:59.932 "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6" 00:16:59.932 ], 00:16:59.932 "product_name": "Malloc disk", 00:16:59.932 "block_size": 512, 00:16:59.932 "num_blocks": 65536, 00:16:59.932 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:16:59.932 "assigned_rate_limits": { 00:16:59.932 "rw_ios_per_sec": 0, 00:16:59.932 "rw_mbytes_per_sec": 0, 00:16:59.932 "r_mbytes_per_sec": 0, 00:16:59.932 "w_mbytes_per_sec": 0 00:16:59.932 }, 00:16:59.932 "claimed": true, 00:16:59.932 "claim_type": "exclusive_write", 00:16:59.932 "zoned": false, 00:16:59.932 "supported_io_types": { 00:16:59.932 "read": true, 00:16:59.932 "write": true, 00:16:59.932 "unmap": true, 00:16:59.932 "flush": true, 00:16:59.932 "reset": true, 00:16:59.932 "nvme_admin": false, 00:16:59.932 "nvme_io": false, 00:16:59.932 "nvme_io_md": false, 00:16:59.932 "write_zeroes": true, 00:16:59.932 "zcopy": true, 00:16:59.932 "get_zone_info": false, 00:16:59.932 "zone_management": false, 00:16:59.932 "zone_append": false, 00:16:59.932 "compare": false, 00:16:59.932 "compare_and_write": false, 00:16:59.932 "abort": true, 00:16:59.932 "seek_hole": false, 00:16:59.932 "seek_data": false, 00:16:59.932 "copy": true, 00:16:59.932 "nvme_iov_md": false 00:16:59.932 }, 00:16:59.932 "memory_domains": [ 00:16:59.932 { 00:16:59.932 "dma_device_id": "system", 00:16:59.932 "dma_device_type": 1 00:16:59.932 }, 00:16:59.932 { 00:16:59.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.932 "dma_device_type": 2 00:16:59.932 } 00:16:59.932 ], 00:16:59.932 "driver_specific": {} 00:16:59.932 } 00:16:59.932 ] 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.932 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.190 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.190 "name": "Existed_Raid", 00:17:00.190 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:00.190 "strip_size_kb": 0, 
00:17:00.191 "state": "configuring", 00:17:00.191 "raid_level": "raid1", 00:17:00.191 "superblock": true, 00:17:00.191 "num_base_bdevs": 4, 00:17:00.191 "num_base_bdevs_discovered": 2, 00:17:00.191 "num_base_bdevs_operational": 4, 00:17:00.191 "base_bdevs_list": [ 00:17:00.191 { 00:17:00.191 "name": "BaseBdev1", 00:17:00.191 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:17:00.191 "is_configured": true, 00:17:00.191 "data_offset": 2048, 00:17:00.191 "data_size": 63488 00:17:00.191 }, 00:17:00.191 { 00:17:00.191 "name": "BaseBdev2", 00:17:00.191 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:00.191 "is_configured": true, 00:17:00.191 "data_offset": 2048, 00:17:00.191 "data_size": 63488 00:17:00.191 }, 00:17:00.191 { 00:17:00.191 "name": "BaseBdev3", 00:17:00.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.191 "is_configured": false, 00:17:00.191 "data_offset": 0, 00:17:00.191 "data_size": 0 00:17:00.191 }, 00:17:00.191 { 00:17:00.191 "name": "BaseBdev4", 00:17:00.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.191 "is_configured": false, 00:17:00.191 "data_offset": 0, 00:17:00.191 "data_size": 0 00:17:00.191 } 00:17:00.191 ] 00:17:00.191 }' 00:17:00.191 23:38:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.191 23:38:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.449 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:00.708 [2024-07-24 23:38:45.539506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:00.708 BaseBdev3 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:00.708 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:00.967 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:00.967 [ 00:17:00.967 { 00:17:00.967 "name": "BaseBdev3", 00:17:00.967 "aliases": [ 00:17:00.967 "0aabb49a-915d-4c6a-89f2-f057c7bce860" 00:17:00.967 ], 00:17:00.967 "product_name": "Malloc disk", 00:17:00.967 "block_size": 512, 00:17:00.967 "num_blocks": 65536, 00:17:00.967 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:00.967 "assigned_rate_limits": { 00:17:00.967 "rw_ios_per_sec": 0, 00:17:00.967 "rw_mbytes_per_sec": 0, 00:17:00.967 "r_mbytes_per_sec": 0, 00:17:00.967 "w_mbytes_per_sec": 0 00:17:00.967 }, 00:17:00.967 "claimed": true, 00:17:00.967 "claim_type": "exclusive_write", 00:17:00.967 "zoned": false, 00:17:00.967 "supported_io_types": { 00:17:00.967 "read": true, 00:17:00.967 "write": true, 00:17:00.967 "unmap": true, 00:17:00.967 "flush": true, 00:17:00.967 "reset": true, 00:17:00.967 "nvme_admin": false, 00:17:00.967 "nvme_io": false, 00:17:00.967 "nvme_io_md": false, 00:17:00.967 "write_zeroes": true, 00:17:00.967 "zcopy": true, 00:17:00.967 "get_zone_info": false, 00:17:00.967 "zone_management": false, 00:17:00.967 "zone_append": false, 00:17:00.967 
"compare": false, 00:17:00.967 "compare_and_write": false, 00:17:00.967 "abort": true, 00:17:00.967 "seek_hole": false, 00:17:00.967 "seek_data": false, 00:17:00.967 "copy": true, 00:17:00.967 "nvme_iov_md": false 00:17:00.967 }, 00:17:00.967 "memory_domains": [ 00:17:00.967 { 00:17:00.967 "dma_device_id": "system", 00:17:00.967 "dma_device_type": 1 00:17:00.967 }, 00:17:00.967 { 00:17:00.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.967 "dma_device_type": 2 00:17:00.967 } 00:17:00.967 ], 00:17:00.967 "driver_specific": {} 00:17:00.967 } 00:17:00.967 ] 00:17:00.967 23:38:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:00.967 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.968 23:38:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.968 23:38:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.227 23:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.227 "name": "Existed_Raid", 00:17:01.227 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:01.227 "strip_size_kb": 0, 00:17:01.227 "state": "configuring", 00:17:01.227 "raid_level": "raid1", 00:17:01.227 "superblock": true, 00:17:01.227 "num_base_bdevs": 4, 00:17:01.227 "num_base_bdevs_discovered": 3, 00:17:01.227 "num_base_bdevs_operational": 4, 00:17:01.227 "base_bdevs_list": [ 00:17:01.227 { 00:17:01.227 "name": "BaseBdev1", 00:17:01.227 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:17:01.227 "is_configured": true, 00:17:01.227 "data_offset": 2048, 00:17:01.227 "data_size": 63488 00:17:01.227 }, 00:17:01.227 { 00:17:01.227 "name": "BaseBdev2", 00:17:01.227 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:01.227 "is_configured": true, 00:17:01.227 "data_offset": 2048, 00:17:01.227 "data_size": 63488 00:17:01.227 }, 00:17:01.227 { 00:17:01.227 "name": "BaseBdev3", 00:17:01.227 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:01.227 "is_configured": true, 00:17:01.227 "data_offset": 2048, 00:17:01.227 "data_size": 63488 00:17:01.227 }, 00:17:01.227 { 00:17:01.227 "name": "BaseBdev4", 00:17:01.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.227 "is_configured": false, 00:17:01.227 "data_offset": 0, 00:17:01.227 "data_size": 0 00:17:01.227 } 00:17:01.227 ] 00:17:01.227 }' 00:17:01.227 23:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.227 23:38:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:01.794 [2024-07-24 23:38:46.709371] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:01.794 [2024-07-24 23:38:46.709510] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x10423d0 00:17:01.794 [2024-07-24 23:38:46.709521] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:01.794 [2024-07-24 23:38:46.709663] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10420a0 00:17:01.794 [2024-07-24 23:38:46.709767] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10423d0 00:17:01.794 [2024-07-24 23:38:46.709774] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10423d0 00:17:01.794 [2024-07-24 23:38:46.709843] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:01.794 BaseBdev4 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:01.794 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:02.052 23:38:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:02.317 [ 00:17:02.317 { 00:17:02.317 "name": "BaseBdev4", 00:17:02.317 "aliases": [ 00:17:02.317 "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80" 00:17:02.317 ], 00:17:02.317 "product_name": "Malloc disk", 00:17:02.317 "block_size": 512, 00:17:02.317 "num_blocks": 65536, 00:17:02.317 "uuid": "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80", 00:17:02.317 "assigned_rate_limits": { 00:17:02.317 "rw_ios_per_sec": 0, 00:17:02.317 "rw_mbytes_per_sec": 0, 00:17:02.317 "r_mbytes_per_sec": 0, 00:17:02.317 "w_mbytes_per_sec": 0 00:17:02.317 }, 00:17:02.317 "claimed": true, 00:17:02.317 "claim_type": "exclusive_write", 00:17:02.317 "zoned": false, 00:17:02.317 "supported_io_types": { 00:17:02.317 "read": true, 00:17:02.317 "write": true, 00:17:02.317 "unmap": true, 00:17:02.317 "flush": true, 00:17:02.317 "reset": true, 00:17:02.317 "nvme_admin": false, 00:17:02.317 "nvme_io": false, 00:17:02.317 "nvme_io_md": false, 00:17:02.317 "write_zeroes": true, 00:17:02.317 "zcopy": true, 00:17:02.317 "get_zone_info": false, 00:17:02.317 "zone_management": false, 00:17:02.317 "zone_append": false, 00:17:02.317 "compare": false, 00:17:02.317 "compare_and_write": false, 00:17:02.317 "abort": true, 00:17:02.317 "seek_hole": false, 00:17:02.317 "seek_data": false, 00:17:02.317 "copy": true, 00:17:02.317 "nvme_iov_md": false 00:17:02.317 }, 00:17:02.317 "memory_domains": [ 00:17:02.317 { 00:17:02.317 "dma_device_id": "system", 00:17:02.317 "dma_device_type": 1 00:17:02.317 }, 00:17:02.317 { 00:17:02.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.317 "dma_device_type": 2 00:17:02.317 } 00:17:02.317 ], 00:17:02.317 "driver_specific": {} 00:17:02.317 } 00:17:02.317 ] 
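The trace above repeats one pattern per base bdev: create a malloc bdev, wait for examine, then re-read `bdev_raid_get_bdevs` and confirm the raid stays `configuring` until all four bases are discovered. A hypothetical condensed sketch of that check follows; the `rpc.py` calls are shown as comments because they need a running SPDK app and its `/var/tmp/spdk-raid.sock` socket, so a canned JSON fragment (values taken from this log) stands in for the live output, and the `jq` select is approximated with `sed`.

```shell
#!/bin/sh
# Sketch of the per-iteration state check traced in this log (assumptions:
# canned JSON in place of live rpc.py output, sed in place of jq).
#
# Live flow, per the trace:
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
#     | jq -r '.[] | select(.name == "Existed_Raid")'

# Canned fragment standing in for the bdev_raid_get_bdevs output above.
raid_bdev_info='{ "name": "Existed_Raid", "state": "configuring", "num_base_bdevs_discovered": 2, "num_base_bdevs_operational": 4 }'

# Pull out the fields verify_raid_bdev_state compares.
state=$(printf '%s' "$raid_bdev_info" | sed -n 's/.*"state": "\([a-z]*\)".*/\1/p')
discovered=$(printf '%s' "$raid_bdev_info" | sed -n 's/.*"num_base_bdevs_discovered": \([0-9]*\).*/\1/p')

# Until all four base bdevs exist, the raid must still be configuring.
if [ "$state" = "configuring" ] && [ "$discovered" -lt 4 ]; then
  echo "raid still configuring ($discovered/4 base bdevs)"
fi
```

Once the fourth malloc bdev is created, the trace shows the raid flip to `"state": "online"` and the loop's assertion switches accordingly.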
00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.317 "name": "Existed_Raid", 00:17:02.317 
"uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:02.317 "strip_size_kb": 0, 00:17:02.317 "state": "online", 00:17:02.317 "raid_level": "raid1", 00:17:02.317 "superblock": true, 00:17:02.317 "num_base_bdevs": 4, 00:17:02.317 "num_base_bdevs_discovered": 4, 00:17:02.317 "num_base_bdevs_operational": 4, 00:17:02.317 "base_bdevs_list": [ 00:17:02.317 { 00:17:02.317 "name": "BaseBdev1", 00:17:02.317 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:17:02.317 "is_configured": true, 00:17:02.317 "data_offset": 2048, 00:17:02.317 "data_size": 63488 00:17:02.317 }, 00:17:02.317 { 00:17:02.317 "name": "BaseBdev2", 00:17:02.317 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:02.317 "is_configured": true, 00:17:02.317 "data_offset": 2048, 00:17:02.317 "data_size": 63488 00:17:02.317 }, 00:17:02.317 { 00:17:02.317 "name": "BaseBdev3", 00:17:02.317 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:02.317 "is_configured": true, 00:17:02.317 "data_offset": 2048, 00:17:02.317 "data_size": 63488 00:17:02.317 }, 00:17:02.317 { 00:17:02.317 "name": "BaseBdev4", 00:17:02.317 "uuid": "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80", 00:17:02.317 "is_configured": true, 00:17:02.317 "data_offset": 2048, 00:17:02.317 "data_size": 63488 00:17:02.317 } 00:17:02.317 ] 00:17:02.317 }' 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.317 23:38:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:02.887 23:38:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:02.887 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:02.887 [2024-07-24 23:38:47.872610] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:03.146 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:03.146 "name": "Existed_Raid", 00:17:03.146 "aliases": [ 00:17:03.146 "4707f680-885b-4c08-af78-ab2462eb4de4" 00:17:03.146 ], 00:17:03.146 "product_name": "Raid Volume", 00:17:03.146 "block_size": 512, 00:17:03.146 "num_blocks": 63488, 00:17:03.146 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:03.146 "assigned_rate_limits": { 00:17:03.146 "rw_ios_per_sec": 0, 00:17:03.146 "rw_mbytes_per_sec": 0, 00:17:03.146 "r_mbytes_per_sec": 0, 00:17:03.146 "w_mbytes_per_sec": 0 00:17:03.146 }, 00:17:03.146 "claimed": false, 00:17:03.146 "zoned": false, 00:17:03.146 "supported_io_types": { 00:17:03.146 "read": true, 00:17:03.146 "write": true, 00:17:03.146 "unmap": false, 00:17:03.146 "flush": false, 00:17:03.146 "reset": true, 00:17:03.146 "nvme_admin": false, 00:17:03.146 "nvme_io": false, 00:17:03.146 "nvme_io_md": false, 00:17:03.146 "write_zeroes": true, 00:17:03.146 "zcopy": false, 00:17:03.146 "get_zone_info": false, 00:17:03.146 "zone_management": false, 00:17:03.146 "zone_append": false, 00:17:03.146 "compare": false, 00:17:03.146 "compare_and_write": false, 00:17:03.146 "abort": false, 00:17:03.146 "seek_hole": false, 00:17:03.146 "seek_data": false, 00:17:03.146 "copy": false, 00:17:03.146 "nvme_iov_md": false 00:17:03.146 }, 00:17:03.146 
"memory_domains": [ 00:17:03.146 { 00:17:03.146 "dma_device_id": "system", 00:17:03.146 "dma_device_type": 1 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.146 "dma_device_type": 2 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "system", 00:17:03.146 "dma_device_type": 1 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.146 "dma_device_type": 2 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "system", 00:17:03.146 "dma_device_type": 1 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.146 "dma_device_type": 2 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "system", 00:17:03.146 "dma_device_type": 1 00:17:03.146 }, 00:17:03.146 { 00:17:03.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.146 "dma_device_type": 2 00:17:03.146 } 00:17:03.146 ], 00:17:03.146 "driver_specific": { 00:17:03.146 "raid": { 00:17:03.146 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:03.146 "strip_size_kb": 0, 00:17:03.146 "state": "online", 00:17:03.146 "raid_level": "raid1", 00:17:03.146 "superblock": true, 00:17:03.146 "num_base_bdevs": 4, 00:17:03.146 "num_base_bdevs_discovered": 4, 00:17:03.146 "num_base_bdevs_operational": 4, 00:17:03.146 "base_bdevs_list": [ 00:17:03.146 { 00:17:03.146 "name": "BaseBdev1", 00:17:03.147 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:17:03.147 "is_configured": true, 00:17:03.147 "data_offset": 2048, 00:17:03.147 "data_size": 63488 00:17:03.147 }, 00:17:03.147 { 00:17:03.147 "name": "BaseBdev2", 00:17:03.147 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:03.147 "is_configured": true, 00:17:03.147 "data_offset": 2048, 00:17:03.147 "data_size": 63488 00:17:03.147 }, 00:17:03.147 { 00:17:03.147 "name": "BaseBdev3", 00:17:03.147 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:03.147 "is_configured": true, 00:17:03.147 "data_offset": 2048, 00:17:03.147 
"data_size": 63488 00:17:03.147 }, 00:17:03.147 { 00:17:03.147 "name": "BaseBdev4", 00:17:03.147 "uuid": "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80", 00:17:03.147 "is_configured": true, 00:17:03.147 "data_offset": 2048, 00:17:03.147 "data_size": 63488 00:17:03.147 } 00:17:03.147 ] 00:17:03.147 } 00:17:03.147 } 00:17:03.147 }' 00:17:03.147 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:03.147 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:03.147 BaseBdev2 00:17:03.147 BaseBdev3 00:17:03.147 BaseBdev4' 00:17:03.147 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.147 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:03.147 23:38:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.147 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.147 "name": "BaseBdev1", 00:17:03.147 "aliases": [ 00:17:03.147 "8a0eaded-3367-426a-9b3e-e3a1c45270c4" 00:17:03.147 ], 00:17:03.147 "product_name": "Malloc disk", 00:17:03.147 "block_size": 512, 00:17:03.147 "num_blocks": 65536, 00:17:03.147 "uuid": "8a0eaded-3367-426a-9b3e-e3a1c45270c4", 00:17:03.147 "assigned_rate_limits": { 00:17:03.147 "rw_ios_per_sec": 0, 00:17:03.147 "rw_mbytes_per_sec": 0, 00:17:03.147 "r_mbytes_per_sec": 0, 00:17:03.147 "w_mbytes_per_sec": 0 00:17:03.147 }, 00:17:03.147 "claimed": true, 00:17:03.147 "claim_type": "exclusive_write", 00:17:03.147 "zoned": false, 00:17:03.147 "supported_io_types": { 00:17:03.147 "read": true, 00:17:03.147 "write": true, 00:17:03.147 "unmap": true, 00:17:03.147 "flush": true, 00:17:03.147 "reset": true, 
00:17:03.147 "nvme_admin": false, 00:17:03.147 "nvme_io": false, 00:17:03.147 "nvme_io_md": false, 00:17:03.147 "write_zeroes": true, 00:17:03.147 "zcopy": true, 00:17:03.147 "get_zone_info": false, 00:17:03.147 "zone_management": false, 00:17:03.147 "zone_append": false, 00:17:03.147 "compare": false, 00:17:03.147 "compare_and_write": false, 00:17:03.147 "abort": true, 00:17:03.147 "seek_hole": false, 00:17:03.147 "seek_data": false, 00:17:03.147 "copy": true, 00:17:03.147 "nvme_iov_md": false 00:17:03.147 }, 00:17:03.147 "memory_domains": [ 00:17:03.147 { 00:17:03.147 "dma_device_id": "system", 00:17:03.147 "dma_device_type": 1 00:17:03.147 }, 00:17:03.147 { 00:17:03.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.147 "dma_device_type": 2 00:17:03.147 } 00:17:03.147 ], 00:17:03.147 "driver_specific": {} 00:17:03.147 }' 00:17:03.147 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.147 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.406 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.665 "name": "BaseBdev2", 00:17:03.665 "aliases": [ 00:17:03.665 "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6" 00:17:03.665 ], 00:17:03.665 "product_name": "Malloc disk", 00:17:03.665 "block_size": 512, 00:17:03.665 "num_blocks": 65536, 00:17:03.665 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:03.665 "assigned_rate_limits": { 00:17:03.665 "rw_ios_per_sec": 0, 00:17:03.665 "rw_mbytes_per_sec": 0, 00:17:03.665 "r_mbytes_per_sec": 0, 00:17:03.665 "w_mbytes_per_sec": 0 00:17:03.665 }, 00:17:03.665 "claimed": true, 00:17:03.665 "claim_type": "exclusive_write", 00:17:03.665 "zoned": false, 00:17:03.665 "supported_io_types": { 00:17:03.665 "read": true, 00:17:03.665 "write": true, 00:17:03.665 "unmap": true, 00:17:03.665 "flush": true, 00:17:03.665 "reset": true, 00:17:03.665 "nvme_admin": false, 00:17:03.665 "nvme_io": false, 00:17:03.665 "nvme_io_md": false, 00:17:03.665 "write_zeroes": true, 00:17:03.665 "zcopy": true, 00:17:03.665 "get_zone_info": false, 00:17:03.665 "zone_management": false, 00:17:03.665 "zone_append": false, 00:17:03.665 "compare": false, 00:17:03.665 "compare_and_write": false, 00:17:03.665 "abort": true, 00:17:03.665 "seek_hole": false, 00:17:03.665 "seek_data": false, 00:17:03.665 "copy": true, 00:17:03.665 "nvme_iov_md": false 00:17:03.665 }, 00:17:03.665 "memory_domains": [ 00:17:03.665 { 
00:17:03.665 "dma_device_id": "system", 00:17:03.665 "dma_device_type": 1 00:17:03.665 }, 00:17:03.665 { 00:17:03.665 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.665 "dma_device_type": 2 00:17:03.665 } 00:17:03.665 ], 00:17:03.665 "driver_specific": {} 00:17:03.665 }' 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.665 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:03.924 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:04.183 23:38:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:04.183 "name": "BaseBdev3", 00:17:04.183 "aliases": [ 00:17:04.183 "0aabb49a-915d-4c6a-89f2-f057c7bce860" 00:17:04.183 ], 00:17:04.183 "product_name": "Malloc disk", 00:17:04.183 "block_size": 512, 00:17:04.183 "num_blocks": 65536, 00:17:04.183 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:04.183 "assigned_rate_limits": { 00:17:04.183 "rw_ios_per_sec": 0, 00:17:04.183 "rw_mbytes_per_sec": 0, 00:17:04.183 "r_mbytes_per_sec": 0, 00:17:04.183 "w_mbytes_per_sec": 0 00:17:04.183 }, 00:17:04.183 "claimed": true, 00:17:04.183 "claim_type": "exclusive_write", 00:17:04.183 "zoned": false, 00:17:04.183 "supported_io_types": { 00:17:04.183 "read": true, 00:17:04.183 "write": true, 00:17:04.183 "unmap": true, 00:17:04.183 "flush": true, 00:17:04.183 "reset": true, 00:17:04.183 "nvme_admin": false, 00:17:04.183 "nvme_io": false, 00:17:04.183 "nvme_io_md": false, 00:17:04.183 "write_zeroes": true, 00:17:04.183 "zcopy": true, 00:17:04.183 "get_zone_info": false, 00:17:04.183 "zone_management": false, 00:17:04.183 "zone_append": false, 00:17:04.183 "compare": false, 00:17:04.183 "compare_and_write": false, 00:17:04.183 "abort": true, 00:17:04.183 "seek_hole": false, 00:17:04.183 "seek_data": false, 00:17:04.183 "copy": true, 00:17:04.183 "nvme_iov_md": false 00:17:04.183 }, 00:17:04.183 "memory_domains": [ 00:17:04.183 { 00:17:04.183 "dma_device_id": "system", 00:17:04.183 "dma_device_type": 1 00:17:04.183 }, 00:17:04.183 { 00:17:04.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.183 "dma_device_type": 2 00:17:04.183 } 00:17:04.183 ], 00:17:04.183 "driver_specific": {} 00:17:04.183 }' 00:17:04.183 23:38:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.183 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:04.443 "name": "BaseBdev4", 00:17:04.443 "aliases": [ 00:17:04.443 "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80" 00:17:04.443 ], 00:17:04.443 "product_name": "Malloc disk", 00:17:04.443 "block_size": 512, 00:17:04.443 "num_blocks": 65536, 00:17:04.443 "uuid": "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80", 00:17:04.443 "assigned_rate_limits": { 00:17:04.443 "rw_ios_per_sec": 0, 00:17:04.443 "rw_mbytes_per_sec": 0, 00:17:04.443 "r_mbytes_per_sec": 0, 00:17:04.443 "w_mbytes_per_sec": 0 
00:17:04.443 }, 00:17:04.443 "claimed": true, 00:17:04.443 "claim_type": "exclusive_write", 00:17:04.443 "zoned": false, 00:17:04.443 "supported_io_types": { 00:17:04.443 "read": true, 00:17:04.443 "write": true, 00:17:04.443 "unmap": true, 00:17:04.443 "flush": true, 00:17:04.443 "reset": true, 00:17:04.443 "nvme_admin": false, 00:17:04.443 "nvme_io": false, 00:17:04.443 "nvme_io_md": false, 00:17:04.443 "write_zeroes": true, 00:17:04.443 "zcopy": true, 00:17:04.443 "get_zone_info": false, 00:17:04.443 "zone_management": false, 00:17:04.443 "zone_append": false, 00:17:04.443 "compare": false, 00:17:04.443 "compare_and_write": false, 00:17:04.443 "abort": true, 00:17:04.443 "seek_hole": false, 00:17:04.443 "seek_data": false, 00:17:04.443 "copy": true, 00:17:04.443 "nvme_iov_md": false 00:17:04.443 }, 00:17:04.443 "memory_domains": [ 00:17:04.443 { 00:17:04.443 "dma_device_id": "system", 00:17:04.443 "dma_device_type": 1 00:17:04.443 }, 00:17:04.443 { 00:17:04.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.443 "dma_device_type": 2 00:17:04.443 } 00:17:04.443 ], 00:17:04.443 "driver_specific": {} 00:17:04.443 }' 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.443 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:04.702 
23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:04.702 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:04.962 [2024-07-24 23:38:49.769342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.962 "name": "Existed_Raid", 00:17:04.962 "uuid": "4707f680-885b-4c08-af78-ab2462eb4de4", 00:17:04.962 "strip_size_kb": 0, 00:17:04.962 "state": "online", 00:17:04.962 "raid_level": "raid1", 00:17:04.962 "superblock": true, 00:17:04.962 "num_base_bdevs": 4, 00:17:04.962 "num_base_bdevs_discovered": 3, 00:17:04.962 "num_base_bdevs_operational": 3, 00:17:04.962 "base_bdevs_list": [ 00:17:04.962 { 00:17:04.962 "name": null, 00:17:04.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.962 "is_configured": false, 00:17:04.962 "data_offset": 2048, 00:17:04.962 "data_size": 63488 00:17:04.962 }, 00:17:04.962 { 00:17:04.962 "name": "BaseBdev2", 00:17:04.962 "uuid": "bbf03e82-03f4-4e03-8c6b-e3e342dd47a6", 00:17:04.962 "is_configured": true, 00:17:04.962 "data_offset": 2048, 00:17:04.962 "data_size": 63488 00:17:04.962 }, 00:17:04.962 { 00:17:04.962 "name": "BaseBdev3", 00:17:04.962 "uuid": "0aabb49a-915d-4c6a-89f2-f057c7bce860", 00:17:04.962 "is_configured": true, 00:17:04.962 "data_offset": 2048, 00:17:04.962 "data_size": 63488 00:17:04.962 }, 00:17:04.962 { 00:17:04.962 "name": 
"BaseBdev4", 00:17:04.962 "uuid": "a78f1c34-5a37-4062-bdbd-17b8fb7c5f80", 00:17:04.962 "is_configured": true, 00:17:04.962 "data_offset": 2048, 00:17:04.962 "data_size": 63488 00:17:04.962 } 00:17:04.962 ] 00:17:04.962 }' 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.962 23:38:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.531 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:05.531 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.531 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:05.531 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:05.790 [2024-07-24 23:38:50.744744] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.790 23:38:50 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:06.050 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:06.050 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:06.050 23:38:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:06.309 [2024-07-24 23:38:51.099387] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:06.309 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:06.569 [2024-07-24 23:38:51.450021] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:06.569 [2024-07-24 23:38:51.450081] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:06.569 [2024-07-24 23:38:51.459953] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:06.569 [2024-07-24 23:38:51.459995] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:06.569 [2024-07-24 23:38:51.460001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10423d0 name Existed_Raid, state offline 00:17:06.569 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:06.569 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:06.569 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.569 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:06.828 BaseBdev2 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:06.828 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.086 23:38:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:07.345 [ 00:17:07.345 { 00:17:07.345 "name": "BaseBdev2", 00:17:07.345 "aliases": [ 00:17:07.345 "8463f779-f333-4604-80b2-fef5546dc115" 00:17:07.345 ], 00:17:07.345 "product_name": "Malloc disk", 00:17:07.345 "block_size": 512, 00:17:07.345 "num_blocks": 65536, 00:17:07.345 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:07.345 "assigned_rate_limits": { 00:17:07.345 "rw_ios_per_sec": 0, 00:17:07.345 "rw_mbytes_per_sec": 0, 00:17:07.345 "r_mbytes_per_sec": 0, 00:17:07.345 "w_mbytes_per_sec": 0 00:17:07.345 }, 00:17:07.345 "claimed": false, 00:17:07.345 "zoned": false, 00:17:07.345 "supported_io_types": { 00:17:07.345 "read": true, 00:17:07.345 "write": true, 00:17:07.345 "unmap": true, 00:17:07.345 "flush": true, 00:17:07.345 "reset": true, 00:17:07.345 "nvme_admin": false, 00:17:07.345 "nvme_io": false, 00:17:07.345 "nvme_io_md": false, 00:17:07.345 "write_zeroes": true, 00:17:07.345 "zcopy": true, 00:17:07.345 "get_zone_info": false, 00:17:07.345 "zone_management": false, 00:17:07.345 "zone_append": false, 00:17:07.345 "compare": false, 00:17:07.345 "compare_and_write": false, 00:17:07.345 "abort": true, 00:17:07.345 "seek_hole": false, 00:17:07.345 "seek_data": false, 00:17:07.345 "copy": true, 00:17:07.345 "nvme_iov_md": false 00:17:07.345 }, 00:17:07.345 
"memory_domains": [ 00:17:07.345 { 00:17:07.345 "dma_device_id": "system", 00:17:07.345 "dma_device_type": 1 00:17:07.345 }, 00:17:07.345 { 00:17:07.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.345 "dma_device_type": 2 00:17:07.345 } 00:17:07.345 ], 00:17:07.345 "driver_specific": {} 00:17:07.345 } 00:17:07.345 ] 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:07.345 BaseBdev3 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:07.345 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.603 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:17:07.862 [ 00:17:07.862 { 00:17:07.862 "name": "BaseBdev3", 00:17:07.862 "aliases": [ 00:17:07.862 "a2371ff5-a2ac-451d-92f6-6d81b64f0241" 00:17:07.862 ], 00:17:07.862 "product_name": "Malloc disk", 00:17:07.862 "block_size": 512, 00:17:07.862 "num_blocks": 65536, 00:17:07.862 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:07.862 "assigned_rate_limits": { 00:17:07.862 "rw_ios_per_sec": 0, 00:17:07.862 "rw_mbytes_per_sec": 0, 00:17:07.862 "r_mbytes_per_sec": 0, 00:17:07.862 "w_mbytes_per_sec": 0 00:17:07.862 }, 00:17:07.862 "claimed": false, 00:17:07.862 "zoned": false, 00:17:07.862 "supported_io_types": { 00:17:07.862 "read": true, 00:17:07.862 "write": true, 00:17:07.862 "unmap": true, 00:17:07.862 "flush": true, 00:17:07.862 "reset": true, 00:17:07.862 "nvme_admin": false, 00:17:07.862 "nvme_io": false, 00:17:07.862 "nvme_io_md": false, 00:17:07.862 "write_zeroes": true, 00:17:07.862 "zcopy": true, 00:17:07.862 "get_zone_info": false, 00:17:07.862 "zone_management": false, 00:17:07.862 "zone_append": false, 00:17:07.862 "compare": false, 00:17:07.862 "compare_and_write": false, 00:17:07.862 "abort": true, 00:17:07.862 "seek_hole": false, 00:17:07.862 "seek_data": false, 00:17:07.862 "copy": true, 00:17:07.862 "nvme_iov_md": false 00:17:07.862 }, 00:17:07.862 "memory_domains": [ 00:17:07.862 { 00:17:07.862 "dma_device_id": "system", 00:17:07.862 "dma_device_type": 1 00:17:07.862 }, 00:17:07.862 { 00:17:07.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.862 "dma_device_type": 2 00:17:07.862 } 00:17:07.862 ], 00:17:07.862 "driver_specific": {} 00:17:07.862 } 00:17:07.862 ] 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:07.862 23:38:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:07.862 BaseBdev4 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:07.862 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:08.128 23:38:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:08.388 [ 00:17:08.388 { 00:17:08.388 "name": "BaseBdev4", 00:17:08.388 "aliases": [ 00:17:08.388 "65af24fa-f5e6-494b-917d-d89115eecbb7" 00:17:08.388 ], 00:17:08.388 "product_name": "Malloc disk", 00:17:08.388 "block_size": 512, 00:17:08.388 "num_blocks": 65536, 00:17:08.388 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:08.388 "assigned_rate_limits": { 00:17:08.388 "rw_ios_per_sec": 0, 00:17:08.388 "rw_mbytes_per_sec": 0, 00:17:08.388 "r_mbytes_per_sec": 0, 00:17:08.388 "w_mbytes_per_sec": 0 00:17:08.388 }, 00:17:08.388 "claimed": false, 00:17:08.388 "zoned": false, 00:17:08.388 "supported_io_types": { 00:17:08.388 "read": true, 
00:17:08.388 "write": true, 00:17:08.388 "unmap": true, 00:17:08.388 "flush": true, 00:17:08.388 "reset": true, 00:17:08.388 "nvme_admin": false, 00:17:08.388 "nvme_io": false, 00:17:08.388 "nvme_io_md": false, 00:17:08.388 "write_zeroes": true, 00:17:08.388 "zcopy": true, 00:17:08.388 "get_zone_info": false, 00:17:08.388 "zone_management": false, 00:17:08.388 "zone_append": false, 00:17:08.388 "compare": false, 00:17:08.388 "compare_and_write": false, 00:17:08.388 "abort": true, 00:17:08.388 "seek_hole": false, 00:17:08.388 "seek_data": false, 00:17:08.388 "copy": true, 00:17:08.388 "nvme_iov_md": false 00:17:08.388 }, 00:17:08.388 "memory_domains": [ 00:17:08.388 { 00:17:08.388 "dma_device_id": "system", 00:17:08.388 "dma_device_type": 1 00:17:08.388 }, 00:17:08.388 { 00:17:08.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.388 "dma_device_type": 2 00:17:08.388 } 00:17:08.388 ], 00:17:08.388 "driver_specific": {} 00:17:08.388 } 00:17:08.388 ] 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:08.388 [2024-07-24 23:38:53.283591] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:08.388 [2024-07-24 23:38:53.283619] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:08.388 [2024-07-24 23:38:53.283631] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:08.388 [2024-07-24 23:38:53.284663] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:08.388 [2024-07-24 23:38:53.284693] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.388 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.647 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.647 "name": "Existed_Raid", 00:17:08.647 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:08.647 "strip_size_kb": 0, 00:17:08.647 "state": 
"configuring", 00:17:08.648 "raid_level": "raid1", 00:17:08.648 "superblock": true, 00:17:08.648 "num_base_bdevs": 4, 00:17:08.648 "num_base_bdevs_discovered": 3, 00:17:08.648 "num_base_bdevs_operational": 4, 00:17:08.648 "base_bdevs_list": [ 00:17:08.648 { 00:17:08.648 "name": "BaseBdev1", 00:17:08.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.648 "is_configured": false, 00:17:08.648 "data_offset": 0, 00:17:08.648 "data_size": 0 00:17:08.648 }, 00:17:08.648 { 00:17:08.648 "name": "BaseBdev2", 00:17:08.648 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:08.648 "is_configured": true, 00:17:08.648 "data_offset": 2048, 00:17:08.648 "data_size": 63488 00:17:08.648 }, 00:17:08.648 { 00:17:08.648 "name": "BaseBdev3", 00:17:08.648 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:08.648 "is_configured": true, 00:17:08.648 "data_offset": 2048, 00:17:08.648 "data_size": 63488 00:17:08.648 }, 00:17:08.648 { 00:17:08.648 "name": "BaseBdev4", 00:17:08.648 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:08.648 "is_configured": true, 00:17:08.648 "data_offset": 2048, 00:17:08.648 "data_size": 63488 00:17:08.648 } 00:17:08.648 ] 00:17:08.648 }' 00:17:08.648 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.648 23:38:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:09.216 23:38:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:09.216 [2024-07-24 23:38:54.101680] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.216 
23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.216 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.480 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.480 "name": "Existed_Raid", 00:17:09.480 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:09.480 "strip_size_kb": 0, 00:17:09.480 "state": "configuring", 00:17:09.480 "raid_level": "raid1", 00:17:09.480 "superblock": true, 00:17:09.480 "num_base_bdevs": 4, 00:17:09.480 "num_base_bdevs_discovered": 2, 00:17:09.480 "num_base_bdevs_operational": 4, 00:17:09.480 "base_bdevs_list": [ 00:17:09.480 { 00:17:09.480 "name": "BaseBdev1", 00:17:09.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.480 "is_configured": false, 00:17:09.480 "data_offset": 0, 00:17:09.480 "data_size": 0 00:17:09.480 }, 00:17:09.480 { 00:17:09.480 
"name": null, 00:17:09.480 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:09.480 "is_configured": false, 00:17:09.480 "data_offset": 2048, 00:17:09.480 "data_size": 63488 00:17:09.480 }, 00:17:09.480 { 00:17:09.480 "name": "BaseBdev3", 00:17:09.480 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:09.480 "is_configured": true, 00:17:09.480 "data_offset": 2048, 00:17:09.480 "data_size": 63488 00:17:09.480 }, 00:17:09.480 { 00:17:09.480 "name": "BaseBdev4", 00:17:09.480 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:09.480 "is_configured": true, 00:17:09.480 "data_offset": 2048, 00:17:09.480 "data_size": 63488 00:17:09.480 } 00:17:09.480 ] 00:17:09.480 }' 00:17:09.480 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.480 23:38:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.078 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.078 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:10.078 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:10.078 23:38:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:10.337 [2024-07-24 23:38:55.119029] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:10.338 BaseBdev1 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:10.338 23:38:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.338 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:10.597 [ 00:17:10.597 { 00:17:10.597 "name": "BaseBdev1", 00:17:10.597 "aliases": [ 00:17:10.597 "a52a8787-deea-4057-b4bd-cbdc5c009552" 00:17:10.597 ], 00:17:10.597 "product_name": "Malloc disk", 00:17:10.597 "block_size": 512, 00:17:10.597 "num_blocks": 65536, 00:17:10.597 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:10.597 "assigned_rate_limits": { 00:17:10.597 "rw_ios_per_sec": 0, 00:17:10.597 "rw_mbytes_per_sec": 0, 00:17:10.597 "r_mbytes_per_sec": 0, 00:17:10.597 "w_mbytes_per_sec": 0 00:17:10.597 }, 00:17:10.597 "claimed": true, 00:17:10.597 "claim_type": "exclusive_write", 00:17:10.597 "zoned": false, 00:17:10.597 "supported_io_types": { 00:17:10.597 "read": true, 00:17:10.597 "write": true, 00:17:10.597 "unmap": true, 00:17:10.597 "flush": true, 00:17:10.597 "reset": true, 00:17:10.597 "nvme_admin": false, 00:17:10.597 "nvme_io": false, 00:17:10.597 "nvme_io_md": false, 00:17:10.597 "write_zeroes": true, 00:17:10.597 "zcopy": true, 00:17:10.597 "get_zone_info": false, 00:17:10.597 "zone_management": false, 00:17:10.597 "zone_append": false, 00:17:10.597 "compare": false, 00:17:10.597 
"compare_and_write": false, 00:17:10.597 "abort": true, 00:17:10.597 "seek_hole": false, 00:17:10.597 "seek_data": false, 00:17:10.597 "copy": true, 00:17:10.597 "nvme_iov_md": false 00:17:10.597 }, 00:17:10.597 "memory_domains": [ 00:17:10.597 { 00:17:10.597 "dma_device_id": "system", 00:17:10.597 "dma_device_type": 1 00:17:10.597 }, 00:17:10.597 { 00:17:10.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.597 "dma_device_type": 2 00:17:10.597 } 00:17:10.597 ], 00:17:10.597 "driver_specific": {} 00:17:10.597 } 00:17:10.597 ] 00:17:10.597 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.598 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.857 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.857 "name": "Existed_Raid", 00:17:10.857 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:10.857 "strip_size_kb": 0, 00:17:10.857 "state": "configuring", 00:17:10.857 "raid_level": "raid1", 00:17:10.857 "superblock": true, 00:17:10.857 "num_base_bdevs": 4, 00:17:10.857 "num_base_bdevs_discovered": 3, 00:17:10.857 "num_base_bdevs_operational": 4, 00:17:10.857 "base_bdevs_list": [ 00:17:10.857 { 00:17:10.857 "name": "BaseBdev1", 00:17:10.857 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:10.857 "is_configured": true, 00:17:10.857 "data_offset": 2048, 00:17:10.857 "data_size": 63488 00:17:10.857 }, 00:17:10.857 { 00:17:10.857 "name": null, 00:17:10.857 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:10.857 "is_configured": false, 00:17:10.857 "data_offset": 2048, 00:17:10.857 "data_size": 63488 00:17:10.857 }, 00:17:10.857 { 00:17:10.857 "name": "BaseBdev3", 00:17:10.857 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:10.857 "is_configured": true, 00:17:10.857 "data_offset": 2048, 00:17:10.857 "data_size": 63488 00:17:10.857 }, 00:17:10.857 { 00:17:10.857 "name": "BaseBdev4", 00:17:10.857 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:10.857 "is_configured": true, 00:17:10.857 "data_offset": 2048, 00:17:10.857 "data_size": 63488 00:17:10.857 } 00:17:10.857 ] 00:17:10.857 }' 00:17:10.857 23:38:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.857 23:38:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.116 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:11.116 23:38:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.375 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:11.375 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:11.634 [2024-07-24 23:38:56.406354] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.634 "name": "Existed_Raid", 00:17:11.634 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:11.634 "strip_size_kb": 0, 00:17:11.634 "state": "configuring", 00:17:11.634 "raid_level": "raid1", 00:17:11.634 "superblock": true, 00:17:11.634 "num_base_bdevs": 4, 00:17:11.634 "num_base_bdevs_discovered": 2, 00:17:11.634 "num_base_bdevs_operational": 4, 00:17:11.634 "base_bdevs_list": [ 00:17:11.634 { 00:17:11.634 "name": "BaseBdev1", 00:17:11.634 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:11.634 "is_configured": true, 00:17:11.634 "data_offset": 2048, 00:17:11.634 "data_size": 63488 00:17:11.634 }, 00:17:11.634 { 00:17:11.634 "name": null, 00:17:11.634 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:11.634 "is_configured": false, 00:17:11.634 "data_offset": 2048, 00:17:11.634 "data_size": 63488 00:17:11.634 }, 00:17:11.634 { 00:17:11.634 "name": null, 00:17:11.634 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:11.634 "is_configured": false, 00:17:11.634 "data_offset": 2048, 00:17:11.634 "data_size": 63488 00:17:11.634 }, 00:17:11.634 { 00:17:11.634 "name": "BaseBdev4", 00:17:11.634 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:11.634 "is_configured": true, 00:17:11.634 "data_offset": 2048, 00:17:11.634 "data_size": 63488 00:17:11.634 } 00:17:11.634 ] 00:17:11.634 }' 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.634 23:38:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:12.202 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:12.202 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:12.461 [2024-07-24 23:38:57.408956] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:12.461 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.720 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.720 "name": "Existed_Raid", 00:17:12.720 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:12.720 "strip_size_kb": 0, 00:17:12.720 "state": "configuring", 00:17:12.720 "raid_level": "raid1", 00:17:12.720 "superblock": true, 00:17:12.720 "num_base_bdevs": 4, 00:17:12.720 "num_base_bdevs_discovered": 3, 00:17:12.720 "num_base_bdevs_operational": 4, 00:17:12.720 "base_bdevs_list": [ 00:17:12.720 { 00:17:12.720 "name": "BaseBdev1", 00:17:12.720 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:12.720 "is_configured": true, 00:17:12.720 "data_offset": 2048, 00:17:12.720 "data_size": 63488 00:17:12.720 }, 00:17:12.720 { 00:17:12.720 "name": null, 00:17:12.720 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:12.720 "is_configured": false, 00:17:12.720 "data_offset": 2048, 00:17:12.720 "data_size": 63488 00:17:12.720 }, 00:17:12.720 { 00:17:12.720 "name": "BaseBdev3", 00:17:12.720 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:12.720 "is_configured": true, 00:17:12.720 "data_offset": 2048, 00:17:12.720 "data_size": 63488 00:17:12.720 }, 00:17:12.720 { 00:17:12.720 "name": "BaseBdev4", 00:17:12.720 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:12.720 "is_configured": true, 00:17:12.720 "data_offset": 2048, 00:17:12.720 "data_size": 63488 00:17:12.720 } 00:17:12.720 ] 00:17:12.720 }' 00:17:12.720 23:38:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.720 23:38:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.288 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:13.288 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:13.288 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:13.288 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:13.548 [2024-07-24 23:38:58.379479] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.548 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.548 23:38:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.807 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.807 "name": "Existed_Raid", 00:17:13.807 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:13.807 "strip_size_kb": 0, 00:17:13.807 "state": "configuring", 00:17:13.807 "raid_level": "raid1", 00:17:13.807 "superblock": true, 00:17:13.807 "num_base_bdevs": 4, 00:17:13.807 "num_base_bdevs_discovered": 2, 00:17:13.807 "num_base_bdevs_operational": 4, 00:17:13.807 "base_bdevs_list": [ 00:17:13.807 { 00:17:13.807 "name": null, 00:17:13.807 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:13.807 "is_configured": false, 00:17:13.807 "data_offset": 2048, 00:17:13.807 "data_size": 63488 00:17:13.807 }, 00:17:13.807 { 00:17:13.807 "name": null, 00:17:13.807 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:13.807 "is_configured": false, 00:17:13.807 "data_offset": 2048, 00:17:13.807 "data_size": 63488 00:17:13.807 }, 00:17:13.807 { 00:17:13.807 "name": "BaseBdev3", 00:17:13.807 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:13.807 "is_configured": true, 00:17:13.807 "data_offset": 2048, 00:17:13.807 "data_size": 63488 00:17:13.807 }, 00:17:13.807 { 00:17:13.807 "name": "BaseBdev4", 00:17:13.807 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:13.807 "is_configured": true, 00:17:13.807 "data_offset": 2048, 00:17:13.807 "data_size": 63488 00:17:13.807 } 00:17:13.807 ] 00:17:13.807 }' 00:17:13.807 23:38:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.807 23:38:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.066 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.066 23:38:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:14.360 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:14.360 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:14.360 [2024-07-24 23:38:59.355776] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.651 23:38:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.651 "name": "Existed_Raid", 00:17:14.651 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:14.651 "strip_size_kb": 0, 00:17:14.651 "state": "configuring", 00:17:14.651 "raid_level": "raid1", 00:17:14.651 "superblock": true, 00:17:14.651 "num_base_bdevs": 4, 00:17:14.651 "num_base_bdevs_discovered": 3, 00:17:14.651 "num_base_bdevs_operational": 4, 00:17:14.651 "base_bdevs_list": [ 00:17:14.651 { 00:17:14.651 "name": null, 00:17:14.651 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:14.651 "is_configured": false, 00:17:14.651 "data_offset": 2048, 00:17:14.651 "data_size": 63488 00:17:14.651 }, 00:17:14.651 { 00:17:14.651 "name": "BaseBdev2", 00:17:14.651 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:14.651 "is_configured": true, 00:17:14.651 "data_offset": 2048, 00:17:14.651 "data_size": 63488 00:17:14.651 }, 00:17:14.651 { 00:17:14.651 "name": "BaseBdev3", 00:17:14.651 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:14.651 "is_configured": true, 00:17:14.651 "data_offset": 2048, 00:17:14.651 "data_size": 63488 00:17:14.651 }, 00:17:14.651 { 00:17:14.651 "name": "BaseBdev4", 00:17:14.651 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:14.651 "is_configured": true, 00:17:14.651 "data_offset": 2048, 00:17:14.651 "data_size": 63488 00:17:14.651 } 00:17:14.651 ] 00:17:14.651 }' 00:17:14.651 23:38:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.652 23:38:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.220 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.220 23:39:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:15.220 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:15.220 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.220 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:15.479 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a52a8787-deea-4057-b4bd-cbdc5c009552 00:17:15.739 [2024-07-24 23:39:00.561570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:15.739 [2024-07-24 23:39:00.561698] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1043020 00:17:15.739 [2024-07-24 23:39:00.561706] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:15.739 [2024-07-24 23:39:00.561821] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x102e2e0 00:17:15.739 [2024-07-24 23:39:00.561912] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1043020 00:17:15.739 [2024-07-24 23:39:00.561918] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1043020 00:17:15.739 [2024-07-24 23:39:00.561981] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:15.739 NewBaseBdev 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:15.739 23:39:00 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:15.739 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:15.998 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:15.998 [ 00:17:15.998 { 00:17:15.998 "name": "NewBaseBdev", 00:17:15.998 "aliases": [ 00:17:15.998 "a52a8787-deea-4057-b4bd-cbdc5c009552" 00:17:15.998 ], 00:17:15.998 "product_name": "Malloc disk", 00:17:15.998 "block_size": 512, 00:17:15.998 "num_blocks": 65536, 00:17:15.998 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:15.998 "assigned_rate_limits": { 00:17:15.998 "rw_ios_per_sec": 0, 00:17:15.998 "rw_mbytes_per_sec": 0, 00:17:15.998 "r_mbytes_per_sec": 0, 00:17:15.998 "w_mbytes_per_sec": 0 00:17:15.998 }, 00:17:15.998 "claimed": true, 00:17:15.998 "claim_type": "exclusive_write", 00:17:15.998 "zoned": false, 00:17:15.998 "supported_io_types": { 00:17:15.998 "read": true, 00:17:15.998 "write": true, 00:17:15.998 "unmap": true, 00:17:15.998 "flush": true, 00:17:15.998 "reset": true, 00:17:15.998 "nvme_admin": false, 00:17:15.998 "nvme_io": false, 00:17:15.998 "nvme_io_md": false, 00:17:15.998 "write_zeroes": true, 00:17:15.998 "zcopy": true, 00:17:15.998 "get_zone_info": false, 00:17:15.998 "zone_management": false, 00:17:15.998 "zone_append": false, 00:17:15.998 "compare": false, 00:17:15.998 
"compare_and_write": false, 00:17:15.998 "abort": true, 00:17:15.998 "seek_hole": false, 00:17:15.998 "seek_data": false, 00:17:15.998 "copy": true, 00:17:15.998 "nvme_iov_md": false 00:17:15.998 }, 00:17:15.998 "memory_domains": [ 00:17:15.998 { 00:17:15.998 "dma_device_id": "system", 00:17:15.998 "dma_device_type": 1 00:17:15.998 }, 00:17:15.998 { 00:17:15.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.999 "dma_device_type": 2 00:17:15.999 } 00:17:15.999 ], 00:17:15.999 "driver_specific": {} 00:17:15.999 } 00:17:15.999 ] 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:17:15.999 23:39:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.258 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.258 "name": "Existed_Raid", 00:17:16.258 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:16.258 "strip_size_kb": 0, 00:17:16.258 "state": "online", 00:17:16.258 "raid_level": "raid1", 00:17:16.258 "superblock": true, 00:17:16.258 "num_base_bdevs": 4, 00:17:16.258 "num_base_bdevs_discovered": 4, 00:17:16.258 "num_base_bdevs_operational": 4, 00:17:16.258 "base_bdevs_list": [ 00:17:16.258 { 00:17:16.258 "name": "NewBaseBdev", 00:17:16.258 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:16.258 "is_configured": true, 00:17:16.258 "data_offset": 2048, 00:17:16.258 "data_size": 63488 00:17:16.258 }, 00:17:16.258 { 00:17:16.258 "name": "BaseBdev2", 00:17:16.258 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:16.258 "is_configured": true, 00:17:16.258 "data_offset": 2048, 00:17:16.258 "data_size": 63488 00:17:16.258 }, 00:17:16.258 { 00:17:16.258 "name": "BaseBdev3", 00:17:16.258 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:16.258 "is_configured": true, 00:17:16.258 "data_offset": 2048, 00:17:16.258 "data_size": 63488 00:17:16.258 }, 00:17:16.258 { 00:17:16.258 "name": "BaseBdev4", 00:17:16.258 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:16.258 "is_configured": true, 00:17:16.258 "data_offset": 2048, 00:17:16.258 "data_size": 63488 00:17:16.258 } 00:17:16.258 ] 00:17:16.258 }' 00:17:16.258 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.258 23:39:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:16.825 23:39:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:16.825 [2024-07-24 23:39:01.708743] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:16.825 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:16.825 "name": "Existed_Raid", 00:17:16.825 "aliases": [ 00:17:16.825 "6e3ff313-a01f-459e-a59d-857eef4c0809" 00:17:16.825 ], 00:17:16.825 "product_name": "Raid Volume", 00:17:16.825 "block_size": 512, 00:17:16.825 "num_blocks": 63488, 00:17:16.825 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:16.825 "assigned_rate_limits": { 00:17:16.825 "rw_ios_per_sec": 0, 00:17:16.825 "rw_mbytes_per_sec": 0, 00:17:16.825 "r_mbytes_per_sec": 0, 00:17:16.825 "w_mbytes_per_sec": 0 00:17:16.825 }, 00:17:16.825 "claimed": false, 00:17:16.825 "zoned": false, 00:17:16.825 "supported_io_types": { 00:17:16.825 "read": true, 00:17:16.825 "write": true, 00:17:16.825 "unmap": false, 00:17:16.825 "flush": false, 00:17:16.825 "reset": true, 00:17:16.825 "nvme_admin": false, 00:17:16.825 "nvme_io": false, 00:17:16.825 "nvme_io_md": false, 00:17:16.825 "write_zeroes": true, 00:17:16.825 "zcopy": false, 00:17:16.825 
"get_zone_info": false, 00:17:16.825 "zone_management": false, 00:17:16.825 "zone_append": false, 00:17:16.825 "compare": false, 00:17:16.825 "compare_and_write": false, 00:17:16.825 "abort": false, 00:17:16.825 "seek_hole": false, 00:17:16.825 "seek_data": false, 00:17:16.825 "copy": false, 00:17:16.825 "nvme_iov_md": false 00:17:16.825 }, 00:17:16.825 "memory_domains": [ 00:17:16.825 { 00:17:16.825 "dma_device_id": "system", 00:17:16.825 "dma_device_type": 1 00:17:16.825 }, 00:17:16.825 { 00:17:16.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.825 "dma_device_type": 2 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "system", 00:17:16.826 "dma_device_type": 1 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.826 "dma_device_type": 2 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "system", 00:17:16.826 "dma_device_type": 1 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.826 "dma_device_type": 2 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "system", 00:17:16.826 "dma_device_type": 1 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.826 "dma_device_type": 2 00:17:16.826 } 00:17:16.826 ], 00:17:16.826 "driver_specific": { 00:17:16.826 "raid": { 00:17:16.826 "uuid": "6e3ff313-a01f-459e-a59d-857eef4c0809", 00:17:16.826 "strip_size_kb": 0, 00:17:16.826 "state": "online", 00:17:16.826 "raid_level": "raid1", 00:17:16.826 "superblock": true, 00:17:16.826 "num_base_bdevs": 4, 00:17:16.826 "num_base_bdevs_discovered": 4, 00:17:16.826 "num_base_bdevs_operational": 4, 00:17:16.826 "base_bdevs_list": [ 00:17:16.826 { 00:17:16.826 "name": "NewBaseBdev", 00:17:16.826 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:16.826 "is_configured": true, 00:17:16.826 "data_offset": 2048, 00:17:16.826 "data_size": 63488 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "name": "BaseBdev2", 00:17:16.826 
"uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:16.826 "is_configured": true, 00:17:16.826 "data_offset": 2048, 00:17:16.826 "data_size": 63488 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "name": "BaseBdev3", 00:17:16.826 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:16.826 "is_configured": true, 00:17:16.826 "data_offset": 2048, 00:17:16.826 "data_size": 63488 00:17:16.826 }, 00:17:16.826 { 00:17:16.826 "name": "BaseBdev4", 00:17:16.826 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:16.826 "is_configured": true, 00:17:16.826 "data_offset": 2048, 00:17:16.826 "data_size": 63488 00:17:16.826 } 00:17:16.826 ] 00:17:16.826 } 00:17:16.826 } 00:17:16.826 }' 00:17:16.826 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:16.826 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:16.826 BaseBdev2 00:17:16.826 BaseBdev3 00:17:16.826 BaseBdev4' 00:17:16.826 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.826 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:16.826 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:17.084 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:17.084 "name": "NewBaseBdev", 00:17:17.084 "aliases": [ 00:17:17.084 "a52a8787-deea-4057-b4bd-cbdc5c009552" 00:17:17.084 ], 00:17:17.084 "product_name": "Malloc disk", 00:17:17.084 "block_size": 512, 00:17:17.084 "num_blocks": 65536, 00:17:17.084 "uuid": "a52a8787-deea-4057-b4bd-cbdc5c009552", 00:17:17.084 "assigned_rate_limits": { 00:17:17.084 "rw_ios_per_sec": 0, 00:17:17.084 "rw_mbytes_per_sec": 0, 
00:17:17.084 "r_mbytes_per_sec": 0, 00:17:17.084 "w_mbytes_per_sec": 0 00:17:17.084 }, 00:17:17.084 "claimed": true, 00:17:17.084 "claim_type": "exclusive_write", 00:17:17.084 "zoned": false, 00:17:17.084 "supported_io_types": { 00:17:17.084 "read": true, 00:17:17.084 "write": true, 00:17:17.084 "unmap": true, 00:17:17.084 "flush": true, 00:17:17.084 "reset": true, 00:17:17.084 "nvme_admin": false, 00:17:17.084 "nvme_io": false, 00:17:17.084 "nvme_io_md": false, 00:17:17.084 "write_zeroes": true, 00:17:17.084 "zcopy": true, 00:17:17.084 "get_zone_info": false, 00:17:17.084 "zone_management": false, 00:17:17.084 "zone_append": false, 00:17:17.084 "compare": false, 00:17:17.084 "compare_and_write": false, 00:17:17.084 "abort": true, 00:17:17.084 "seek_hole": false, 00:17:17.084 "seek_data": false, 00:17:17.084 "copy": true, 00:17:17.084 "nvme_iov_md": false 00:17:17.084 }, 00:17:17.084 "memory_domains": [ 00:17:17.084 { 00:17:17.084 "dma_device_id": "system", 00:17:17.084 "dma_device_type": 1 00:17:17.084 }, 00:17:17.084 { 00:17:17.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.084 "dma_device_type": 2 00:17:17.084 } 00:17:17.084 ], 00:17:17.084 "driver_specific": {} 00:17:17.084 }' 00:17:17.084 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.084 23:39:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.084 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:17.084 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.084 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.343 23:39:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:17.343 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:17.602 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:17.602 "name": "BaseBdev2", 00:17:17.602 "aliases": [ 00:17:17.602 "8463f779-f333-4604-80b2-fef5546dc115" 00:17:17.602 ], 00:17:17.602 "product_name": "Malloc disk", 00:17:17.602 "block_size": 512, 00:17:17.602 "num_blocks": 65536, 00:17:17.603 "uuid": "8463f779-f333-4604-80b2-fef5546dc115", 00:17:17.603 "assigned_rate_limits": { 00:17:17.603 "rw_ios_per_sec": 0, 00:17:17.603 "rw_mbytes_per_sec": 0, 00:17:17.603 "r_mbytes_per_sec": 0, 00:17:17.603 "w_mbytes_per_sec": 0 00:17:17.603 }, 00:17:17.603 "claimed": true, 00:17:17.603 "claim_type": "exclusive_write", 00:17:17.603 "zoned": false, 00:17:17.603 "supported_io_types": { 00:17:17.603 "read": true, 00:17:17.603 "write": true, 00:17:17.603 "unmap": true, 00:17:17.603 "flush": true, 00:17:17.603 "reset": true, 00:17:17.603 "nvme_admin": false, 00:17:17.603 "nvme_io": false, 00:17:17.603 "nvme_io_md": false, 00:17:17.603 "write_zeroes": true, 00:17:17.603 "zcopy": true, 00:17:17.603 
"get_zone_info": false, 00:17:17.603 "zone_management": false, 00:17:17.603 "zone_append": false, 00:17:17.603 "compare": false, 00:17:17.603 "compare_and_write": false, 00:17:17.603 "abort": true, 00:17:17.603 "seek_hole": false, 00:17:17.603 "seek_data": false, 00:17:17.603 "copy": true, 00:17:17.603 "nvme_iov_md": false 00:17:17.603 }, 00:17:17.603 "memory_domains": [ 00:17:17.603 { 00:17:17.603 "dma_device_id": "system", 00:17:17.603 "dma_device_type": 1 00:17:17.603 }, 00:17:17.603 { 00:17:17.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.603 "dma_device_type": 2 00:17:17.603 } 00:17:17.603 ], 00:17:17.603 "driver_specific": {} 00:17:17.603 }' 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.603 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:17.862 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:18.121 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.121 "name": "BaseBdev3", 00:17:18.121 "aliases": [ 00:17:18.121 "a2371ff5-a2ac-451d-92f6-6d81b64f0241" 00:17:18.121 ], 00:17:18.121 "product_name": "Malloc disk", 00:17:18.121 "block_size": 512, 00:17:18.121 "num_blocks": 65536, 00:17:18.121 "uuid": "a2371ff5-a2ac-451d-92f6-6d81b64f0241", 00:17:18.121 "assigned_rate_limits": { 00:17:18.121 "rw_ios_per_sec": 0, 00:17:18.121 "rw_mbytes_per_sec": 0, 00:17:18.121 "r_mbytes_per_sec": 0, 00:17:18.121 "w_mbytes_per_sec": 0 00:17:18.121 }, 00:17:18.121 "claimed": true, 00:17:18.121 "claim_type": "exclusive_write", 00:17:18.121 "zoned": false, 00:17:18.121 "supported_io_types": { 00:17:18.121 "read": true, 00:17:18.121 "write": true, 00:17:18.121 "unmap": true, 00:17:18.121 "flush": true, 00:17:18.121 "reset": true, 00:17:18.121 "nvme_admin": false, 00:17:18.121 "nvme_io": false, 00:17:18.121 "nvme_io_md": false, 00:17:18.121 "write_zeroes": true, 00:17:18.121 "zcopy": true, 00:17:18.121 "get_zone_info": false, 00:17:18.121 "zone_management": false, 00:17:18.121 "zone_append": false, 00:17:18.121 "compare": false, 00:17:18.121 "compare_and_write": false, 00:17:18.121 "abort": true, 00:17:18.121 "seek_hole": false, 00:17:18.121 "seek_data": false, 00:17:18.121 "copy": true, 00:17:18.121 "nvme_iov_md": false 00:17:18.121 }, 00:17:18.121 "memory_domains": [ 00:17:18.121 { 00:17:18.121 "dma_device_id": "system", 00:17:18.121 "dma_device_type": 1 00:17:18.121 }, 00:17:18.121 { 00:17:18.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.121 
"dma_device_type": 2 00:17:18.121 } 00:17:18.121 ], 00:17:18.121 "driver_specific": {} 00:17:18.121 }' 00:17:18.121 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.121 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.121 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:18.121 23:39:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.121 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.121 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:18.121 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.121 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:18.380 "name": "BaseBdev4", 00:17:18.380 "aliases": [ 00:17:18.380 
"65af24fa-f5e6-494b-917d-d89115eecbb7" 00:17:18.380 ], 00:17:18.380 "product_name": "Malloc disk", 00:17:18.380 "block_size": 512, 00:17:18.380 "num_blocks": 65536, 00:17:18.380 "uuid": "65af24fa-f5e6-494b-917d-d89115eecbb7", 00:17:18.380 "assigned_rate_limits": { 00:17:18.380 "rw_ios_per_sec": 0, 00:17:18.380 "rw_mbytes_per_sec": 0, 00:17:18.380 "r_mbytes_per_sec": 0, 00:17:18.380 "w_mbytes_per_sec": 0 00:17:18.380 }, 00:17:18.380 "claimed": true, 00:17:18.380 "claim_type": "exclusive_write", 00:17:18.380 "zoned": false, 00:17:18.380 "supported_io_types": { 00:17:18.380 "read": true, 00:17:18.380 "write": true, 00:17:18.380 "unmap": true, 00:17:18.380 "flush": true, 00:17:18.380 "reset": true, 00:17:18.380 "nvme_admin": false, 00:17:18.380 "nvme_io": false, 00:17:18.380 "nvme_io_md": false, 00:17:18.380 "write_zeroes": true, 00:17:18.380 "zcopy": true, 00:17:18.380 "get_zone_info": false, 00:17:18.380 "zone_management": false, 00:17:18.380 "zone_append": false, 00:17:18.380 "compare": false, 00:17:18.380 "compare_and_write": false, 00:17:18.380 "abort": true, 00:17:18.380 "seek_hole": false, 00:17:18.380 "seek_data": false, 00:17:18.380 "copy": true, 00:17:18.380 "nvme_iov_md": false 00:17:18.380 }, 00:17:18.380 "memory_domains": [ 00:17:18.380 { 00:17:18.380 "dma_device_id": "system", 00:17:18.380 "dma_device_type": 1 00:17:18.380 }, 00:17:18.380 { 00:17:18.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:18.380 "dma_device_type": 2 00:17:18.380 } 00:17:18.380 ], 00:17:18.380 "driver_specific": {} 00:17:18.380 }' 00:17:18.380 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.637 23:39:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:18.637 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:18.896 [2024-07-24 23:39:03.846096] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:18.896 [2024-07-24 23:39:03.846115] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:18.896 [2024-07-24 23:39:03.846155] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:18.896 [2024-07-24 23:39:03.846340] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:18.896 [2024-07-24 23:39:03.846347] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1043020 name Existed_Raid, state offline 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 335033 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 335033 ']' 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@954 -- # kill -0 335033 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:18.896 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 335033 00:17:19.155 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:19.155 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:19.155 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 335033' 00:17:19.155 killing process with pid 335033 00:17:19.155 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 335033 00:17:19.155 [2024-07-24 23:39:03.902900] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:19.155 23:39:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 335033 00:17:19.155 [2024-07-24 23:39:03.934661] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:19.155 23:39:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:19.155 00:17:19.155 real 0m23.999s 00:17:19.155 user 0m44.758s 00:17:19.155 sys 0m3.655s 00:17:19.155 23:39:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:19.155 23:39:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.155 ************************************ 00:17:19.155 END TEST raid_state_function_test_sb 00:17:19.155 ************************************ 00:17:19.155 23:39:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:17:19.155 23:39:04 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:17:19.155 23:39:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:19.155 23:39:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:19.414 ************************************ 00:17:19.414 START TEST raid_superblock_test 00:17:19.414 ************************************ 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 4 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:19.414 23:39:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=339716 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 339716 /var/tmp/spdk-raid.sock 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 339716 ']' 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:19.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:19.414 23:39:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.414 [2024-07-24 23:39:04.219795] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:17:19.414 [2024-07-24 23:39:04.219832] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid339716 ] 00:17:19.414 [2024-07-24 23:39:04.281849] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:19.414 [2024-07-24 23:39:04.359319] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:19.414 [2024-07-24 23:39:04.407591] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:19.414 [2024-07-24 23:39:04.407617] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:20.350 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:20.351 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:20.351 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:20.351 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:20.351 malloc1 00:17:20.351 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:20.351 [2024-07-24 23:39:05.338641] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:20.351 [2024-07-24 23:39:05.338672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:20.351 [2024-07-24 23:39:05.338683] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb0dd0 00:17:20.351 [2024-07-24 23:39:05.338705] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:20.351 [2024-07-24 23:39:05.339861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:20.351 [2024-07-24 23:39:05.339882] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:20.351 pt1 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:20.609 23:39:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:20.609 malloc2 00:17:20.609 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:20.868 [2024-07-24 23:39:05.679062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:20.868 [2024-07-24 23:39:05.679091] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:20.868 [2024-07-24 23:39:05.679103] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb18d0 00:17:20.868 [2024-07-24 23:39:05.679110] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:20.868 [2024-07-24 23:39:05.680141] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:20.868 [2024-07-24 23:39:05.680161] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:20.868 pt2 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:20.868 23:39:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:20.868 malloc3 00:17:20.868 23:39:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:21.126 [2024-07-24 23:39:06.003518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:21.126 [2024-07-24 23:39:06.003551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:21.126 [2024-07-24 23:39:06.003561] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1072740 00:17:21.126 [2024-07-24 23:39:06.003567] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:21.126 [2024-07-24 23:39:06.004715] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:21.126 [2024-07-24 23:39:06.004735] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:21.126 pt3 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:21.126 
23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:21.126 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:21.385 malloc4 00:17:21.385 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:21.385 [2024-07-24 23:39:06.347778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:21.385 [2024-07-24 23:39:06.347810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:21.385 [2024-07-24 23:39:06.347822] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa8990 00:17:21.385 [2024-07-24 23:39:06.347844] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:21.385 [2024-07-24 23:39:06.348906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:21.385 [2024-07-24 23:39:06.348925] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:21.385 pt4 00:17:21.385 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:21.385 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:21.385 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:17:21.644 [2024-07-24 23:39:06.504198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:17:21.644 [2024-07-24 23:39:06.505045] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:21.644 [2024-07-24 23:39:06.505084] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:21.644 [2024-07-24 23:39:06.505113] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:21.644 [2024-07-24 23:39:06.505233] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfaa9d0 00:17:21.644 [2024-07-24 23:39:06.505240] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:21.644 [2024-07-24 23:39:06.505377] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb37d0 00:17:21.644 [2024-07-24 23:39:06.505489] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfaa9d0 00:17:21.644 [2024-07-24 23:39:06.505495] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfaa9d0 00:17:21.644 [2024-07-24 23:39:06.505563] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.644 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.902 23:39:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.902 "name": "raid_bdev1", 00:17:21.902 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:21.902 "strip_size_kb": 0, 00:17:21.902 "state": "online", 00:17:21.902 "raid_level": "raid1", 00:17:21.902 "superblock": true, 00:17:21.902 "num_base_bdevs": 4, 00:17:21.902 "num_base_bdevs_discovered": 4, 00:17:21.902 "num_base_bdevs_operational": 4, 00:17:21.902 "base_bdevs_list": [ 00:17:21.902 { 00:17:21.902 "name": "pt1", 00:17:21.902 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:21.902 "is_configured": true, 00:17:21.902 "data_offset": 2048, 00:17:21.902 "data_size": 63488 00:17:21.902 }, 00:17:21.902 { 00:17:21.902 "name": "pt2", 00:17:21.903 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:21.903 "is_configured": true, 00:17:21.903 "data_offset": 2048, 00:17:21.903 "data_size": 63488 00:17:21.903 }, 00:17:21.903 { 00:17:21.903 "name": "pt3", 00:17:21.903 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:21.903 "is_configured": true, 00:17:21.903 "data_offset": 2048, 00:17:21.903 "data_size": 63488 00:17:21.903 }, 00:17:21.903 { 00:17:21.903 "name": "pt4", 00:17:21.903 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:21.903 "is_configured": true, 00:17:21.903 "data_offset": 2048, 00:17:21.903 "data_size": 63488 00:17:21.903 } 00:17:21.903 ] 00:17:21.903 }' 00:17:21.903 23:39:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.903 23:39:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:22.470 [2024-07-24 23:39:07.358727] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:22.470 "name": "raid_bdev1", 00:17:22.470 "aliases": [ 00:17:22.470 "61fa06a6-5f05-4c87-bcf0-4dddfb08decf" 00:17:22.470 ], 00:17:22.470 "product_name": "Raid Volume", 00:17:22.470 "block_size": 512, 00:17:22.470 "num_blocks": 63488, 00:17:22.470 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:22.470 "assigned_rate_limits": { 00:17:22.470 "rw_ios_per_sec": 0, 00:17:22.470 "rw_mbytes_per_sec": 0, 00:17:22.470 "r_mbytes_per_sec": 0, 00:17:22.470 "w_mbytes_per_sec": 0 00:17:22.470 }, 00:17:22.470 "claimed": false, 00:17:22.470 "zoned": false, 00:17:22.470 "supported_io_types": { 00:17:22.470 "read": true, 00:17:22.470 "write": true, 00:17:22.470 
"unmap": false, 00:17:22.470 "flush": false, 00:17:22.470 "reset": true, 00:17:22.470 "nvme_admin": false, 00:17:22.470 "nvme_io": false, 00:17:22.470 "nvme_io_md": false, 00:17:22.470 "write_zeroes": true, 00:17:22.470 "zcopy": false, 00:17:22.470 "get_zone_info": false, 00:17:22.470 "zone_management": false, 00:17:22.470 "zone_append": false, 00:17:22.470 "compare": false, 00:17:22.470 "compare_and_write": false, 00:17:22.470 "abort": false, 00:17:22.470 "seek_hole": false, 00:17:22.470 "seek_data": false, 00:17:22.470 "copy": false, 00:17:22.470 "nvme_iov_md": false 00:17:22.470 }, 00:17:22.470 "memory_domains": [ 00:17:22.470 { 00:17:22.470 "dma_device_id": "system", 00:17:22.470 "dma_device_type": 1 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.470 "dma_device_type": 2 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "system", 00:17:22.470 "dma_device_type": 1 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.470 "dma_device_type": 2 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "system", 00:17:22.470 "dma_device_type": 1 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.470 "dma_device_type": 2 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "system", 00:17:22.470 "dma_device_type": 1 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.470 "dma_device_type": 2 00:17:22.470 } 00:17:22.470 ], 00:17:22.470 "driver_specific": { 00:17:22.470 "raid": { 00:17:22.470 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:22.470 "strip_size_kb": 0, 00:17:22.470 "state": "online", 00:17:22.470 "raid_level": "raid1", 00:17:22.470 "superblock": true, 00:17:22.470 "num_base_bdevs": 4, 00:17:22.470 "num_base_bdevs_discovered": 4, 00:17:22.470 "num_base_bdevs_operational": 4, 00:17:22.470 "base_bdevs_list": [ 00:17:22.470 { 00:17:22.470 "name": "pt1", 
00:17:22.470 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:22.470 "is_configured": true, 00:17:22.470 "data_offset": 2048, 00:17:22.470 "data_size": 63488 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "name": "pt2", 00:17:22.470 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:22.470 "is_configured": true, 00:17:22.470 "data_offset": 2048, 00:17:22.470 "data_size": 63488 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "name": "pt3", 00:17:22.470 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:22.470 "is_configured": true, 00:17:22.470 "data_offset": 2048, 00:17:22.470 "data_size": 63488 00:17:22.470 }, 00:17:22.470 { 00:17:22.470 "name": "pt4", 00:17:22.470 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:22.470 "is_configured": true, 00:17:22.470 "data_offset": 2048, 00:17:22.470 "data_size": 63488 00:17:22.470 } 00:17:22.470 ] 00:17:22.470 } 00:17:22.470 } 00:17:22.470 }' 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:22.470 pt2 00:17:22.470 pt3 00:17:22.470 pt4' 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:22.470 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.729 "name": "pt1", 00:17:22.729 "aliases": [ 00:17:22.729 "00000000-0000-0000-0000-000000000001" 00:17:22.729 ], 00:17:22.729 "product_name": "passthru", 00:17:22.729 "block_size": 512, 00:17:22.729 "num_blocks": 65536, 00:17:22.729 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:17:22.729 "assigned_rate_limits": { 00:17:22.729 "rw_ios_per_sec": 0, 00:17:22.729 "rw_mbytes_per_sec": 0, 00:17:22.729 "r_mbytes_per_sec": 0, 00:17:22.729 "w_mbytes_per_sec": 0 00:17:22.729 }, 00:17:22.729 "claimed": true, 00:17:22.729 "claim_type": "exclusive_write", 00:17:22.729 "zoned": false, 00:17:22.729 "supported_io_types": { 00:17:22.729 "read": true, 00:17:22.729 "write": true, 00:17:22.729 "unmap": true, 00:17:22.729 "flush": true, 00:17:22.729 "reset": true, 00:17:22.729 "nvme_admin": false, 00:17:22.729 "nvme_io": false, 00:17:22.729 "nvme_io_md": false, 00:17:22.729 "write_zeroes": true, 00:17:22.729 "zcopy": true, 00:17:22.729 "get_zone_info": false, 00:17:22.729 "zone_management": false, 00:17:22.729 "zone_append": false, 00:17:22.729 "compare": false, 00:17:22.729 "compare_and_write": false, 00:17:22.729 "abort": true, 00:17:22.729 "seek_hole": false, 00:17:22.729 "seek_data": false, 00:17:22.729 "copy": true, 00:17:22.729 "nvme_iov_md": false 00:17:22.729 }, 00:17:22.729 "memory_domains": [ 00:17:22.729 { 00:17:22.729 "dma_device_id": "system", 00:17:22.729 "dma_device_type": 1 00:17:22.729 }, 00:17:22.729 { 00:17:22.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.729 "dma_device_type": 2 00:17:22.729 } 00:17:22.729 ], 00:17:22.729 "driver_specific": { 00:17:22.729 "passthru": { 00:17:22.729 "name": "pt1", 00:17:22.729 "base_bdev_name": "malloc1" 00:17:22.729 } 00:17:22.729 } 00:17:22.729 }' 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.729 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.988 23:39:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:22.988 23:39:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:23.247 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:23.247 "name": "pt2", 00:17:23.247 "aliases": [ 00:17:23.247 "00000000-0000-0000-0000-000000000002" 00:17:23.247 ], 00:17:23.247 "product_name": "passthru", 00:17:23.247 "block_size": 512, 00:17:23.247 "num_blocks": 65536, 00:17:23.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:23.247 "assigned_rate_limits": { 00:17:23.247 "rw_ios_per_sec": 0, 00:17:23.247 "rw_mbytes_per_sec": 0, 00:17:23.247 "r_mbytes_per_sec": 0, 00:17:23.248 "w_mbytes_per_sec": 0 00:17:23.248 }, 00:17:23.248 "claimed": true, 00:17:23.248 "claim_type": "exclusive_write", 00:17:23.248 "zoned": false, 00:17:23.248 "supported_io_types": { 00:17:23.248 "read": true, 00:17:23.248 "write": true, 00:17:23.248 "unmap": true, 00:17:23.248 "flush": true, 00:17:23.248 "reset": true, 00:17:23.248 "nvme_admin": false, 00:17:23.248 
"nvme_io": false, 00:17:23.248 "nvme_io_md": false, 00:17:23.248 "write_zeroes": true, 00:17:23.248 "zcopy": true, 00:17:23.248 "get_zone_info": false, 00:17:23.248 "zone_management": false, 00:17:23.248 "zone_append": false, 00:17:23.248 "compare": false, 00:17:23.248 "compare_and_write": false, 00:17:23.248 "abort": true, 00:17:23.248 "seek_hole": false, 00:17:23.248 "seek_data": false, 00:17:23.248 "copy": true, 00:17:23.248 "nvme_iov_md": false 00:17:23.248 }, 00:17:23.248 "memory_domains": [ 00:17:23.248 { 00:17:23.248 "dma_device_id": "system", 00:17:23.248 "dma_device_type": 1 00:17:23.248 }, 00:17:23.248 { 00:17:23.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.248 "dma_device_type": 2 00:17:23.248 } 00:17:23.248 ], 00:17:23.248 "driver_specific": { 00:17:23.248 "passthru": { 00:17:23.248 "name": "pt2", 00:17:23.248 "base_bdev_name": "malloc2" 00:17:23.248 } 00:17:23.248 } 00:17:23.248 }' 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:23.248 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:23.507 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:23.766 "name": "pt3", 00:17:23.766 "aliases": [ 00:17:23.766 "00000000-0000-0000-0000-000000000003" 00:17:23.766 ], 00:17:23.766 "product_name": "passthru", 00:17:23.766 "block_size": 512, 00:17:23.766 "num_blocks": 65536, 00:17:23.766 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:23.766 "assigned_rate_limits": { 00:17:23.766 "rw_ios_per_sec": 0, 00:17:23.766 "rw_mbytes_per_sec": 0, 00:17:23.766 "r_mbytes_per_sec": 0, 00:17:23.766 "w_mbytes_per_sec": 0 00:17:23.766 }, 00:17:23.766 "claimed": true, 00:17:23.766 "claim_type": "exclusive_write", 00:17:23.766 "zoned": false, 00:17:23.766 "supported_io_types": { 00:17:23.766 "read": true, 00:17:23.766 "write": true, 00:17:23.766 "unmap": true, 00:17:23.766 "flush": true, 00:17:23.766 "reset": true, 00:17:23.766 "nvme_admin": false, 00:17:23.766 "nvme_io": false, 00:17:23.766 "nvme_io_md": false, 00:17:23.766 "write_zeroes": true, 00:17:23.766 "zcopy": true, 00:17:23.766 "get_zone_info": false, 00:17:23.766 "zone_management": false, 00:17:23.766 "zone_append": false, 00:17:23.766 "compare": false, 00:17:23.766 "compare_and_write": false, 00:17:23.766 "abort": true, 00:17:23.766 "seek_hole": false, 00:17:23.766 "seek_data": false, 00:17:23.766 "copy": true, 00:17:23.766 "nvme_iov_md": false 00:17:23.766 }, 00:17:23.766 "memory_domains": [ 00:17:23.766 { 00:17:23.766 "dma_device_id": "system", 00:17:23.766 
"dma_device_type": 1 00:17:23.766 }, 00:17:23.766 { 00:17:23.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.766 "dma_device_type": 2 00:17:23.766 } 00:17:23.766 ], 00:17:23.766 "driver_specific": { 00:17:23.766 "passthru": { 00:17:23.766 "name": "pt3", 00:17:23.766 "base_bdev_name": "malloc3" 00:17:23.766 } 00:17:23.766 } 00:17:23.766 }' 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:23.766 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:24.024 23:39:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.282 23:39:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.282 "name": "pt4", 00:17:24.282 "aliases": [ 00:17:24.282 "00000000-0000-0000-0000-000000000004" 00:17:24.282 ], 00:17:24.282 "product_name": "passthru", 00:17:24.282 "block_size": 512, 00:17:24.282 "num_blocks": 65536, 00:17:24.282 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:24.282 "assigned_rate_limits": { 00:17:24.282 "rw_ios_per_sec": 0, 00:17:24.282 "rw_mbytes_per_sec": 0, 00:17:24.282 "r_mbytes_per_sec": 0, 00:17:24.282 "w_mbytes_per_sec": 0 00:17:24.282 }, 00:17:24.282 "claimed": true, 00:17:24.282 "claim_type": "exclusive_write", 00:17:24.282 "zoned": false, 00:17:24.282 "supported_io_types": { 00:17:24.282 "read": true, 00:17:24.282 "write": true, 00:17:24.282 "unmap": true, 00:17:24.282 "flush": true, 00:17:24.282 "reset": true, 00:17:24.282 "nvme_admin": false, 00:17:24.282 "nvme_io": false, 00:17:24.282 "nvme_io_md": false, 00:17:24.282 "write_zeroes": true, 00:17:24.282 "zcopy": true, 00:17:24.282 "get_zone_info": false, 00:17:24.282 "zone_management": false, 00:17:24.283 "zone_append": false, 00:17:24.283 "compare": false, 00:17:24.283 "compare_and_write": false, 00:17:24.283 "abort": true, 00:17:24.283 "seek_hole": false, 00:17:24.283 "seek_data": false, 00:17:24.283 "copy": true, 00:17:24.283 "nvme_iov_md": false 00:17:24.283 }, 00:17:24.283 "memory_domains": [ 00:17:24.283 { 00:17:24.283 "dma_device_id": "system", 00:17:24.283 "dma_device_type": 1 00:17:24.283 }, 00:17:24.283 { 00:17:24.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.283 "dma_device_type": 2 00:17:24.283 } 00:17:24.283 ], 00:17:24.283 "driver_specific": { 00:17:24.283 "passthru": { 00:17:24.283 "name": "pt4", 00:17:24.283 "base_bdev_name": "malloc4" 00:17:24.283 } 00:17:24.283 } 00:17:24.283 }' 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.283 23:39:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.283 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:24.541 [2024-07-24 23:39:09.452137] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=61fa06a6-5f05-4c87-bcf0-4dddfb08decf 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 61fa06a6-5f05-4c87-bcf0-4dddfb08decf ']' 00:17:24.541 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:24.799 [2024-07-24 23:39:09.624375] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:24.799 
[2024-07-24 23:39:09.624388] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:24.799 [2024-07-24 23:39:09.624421] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:24.799 [2024-07-24 23:39:09.624485] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:24.799 [2024-07-24 23:39:09.624492] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfaa9d0 name raid_bdev1, state offline 00:17:24.799 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.799 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:25.057 23:39:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:25.315 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:25.315 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:25.315 23:39:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:25.315 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:25.573 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:25.573 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:25.832 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:25.832 [2024-07-24 23:39:10.807427] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:25.832 [2024-07-24 23:39:10.808378] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:25.832 [2024-07-24 23:39:10.808408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:25.832 [2024-07-24 23:39:10.808429] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:25.832 [2024-07-24 23:39:10.808459] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:25.832 [2024-07-24 23:39:10.808493] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:25.832 [2024-07-24 23:39:10.808506] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:25.832 [2024-07-24 23:39:10.808517] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:17:25.832 [2024-07-24 
23:39:10.808526] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:25.832 [2024-07-24 23:39:10.808532] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb1250 name raid_bdev1, state configuring 00:17:25.832 request: 00:17:25.832 { 00:17:25.832 "name": "raid_bdev1", 00:17:25.832 "raid_level": "raid1", 00:17:25.832 "base_bdevs": [ 00:17:25.832 "malloc1", 00:17:25.832 "malloc2", 00:17:25.832 "malloc3", 00:17:25.832 "malloc4" 00:17:25.832 ], 00:17:25.833 "superblock": false, 00:17:25.833 "method": "bdev_raid_create", 00:17:25.833 "req_id": 1 00:17:25.833 } 00:17:25.833 Got JSON-RPC error response 00:17:25.833 response: 00:17:25.833 { 00:17:25.833 "code": -17, 00:17:25.833 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:25.833 } 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.833 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:26.092 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:26.092 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:26.092 23:39:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:26.350 [2024-07-24 23:39:11.124209] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:26.350 [2024-07-24 23:39:11.124240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.350 [2024-07-24 23:39:11.124253] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1072970 00:17:26.350 [2024-07-24 23:39:11.124276] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.350 [2024-07-24 23:39:11.125426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.350 [2024-07-24 23:39:11.125448] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:26.350 [2024-07-24 23:39:11.125507] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:26.350 [2024-07-24 23:39:11.125528] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:26.350 pt1 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.350 23:39:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:26.350 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.350 "name": "raid_bdev1", 00:17:26.350 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:26.350 "strip_size_kb": 0, 00:17:26.350 "state": "configuring", 00:17:26.350 "raid_level": "raid1", 00:17:26.350 "superblock": true, 00:17:26.350 "num_base_bdevs": 4, 00:17:26.350 "num_base_bdevs_discovered": 1, 00:17:26.350 "num_base_bdevs_operational": 4, 00:17:26.350 "base_bdevs_list": [ 00:17:26.350 { 00:17:26.350 "name": "pt1", 00:17:26.350 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:26.350 "is_configured": true, 00:17:26.350 "data_offset": 2048, 00:17:26.350 "data_size": 63488 00:17:26.350 }, 00:17:26.350 { 00:17:26.350 "name": null, 00:17:26.350 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:26.350 "is_configured": false, 00:17:26.350 "data_offset": 2048, 00:17:26.350 "data_size": 63488 00:17:26.350 }, 00:17:26.350 { 00:17:26.350 "name": null, 00:17:26.350 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:26.350 "is_configured": false, 00:17:26.350 "data_offset": 2048, 00:17:26.350 "data_size": 63488 00:17:26.350 }, 00:17:26.350 { 00:17:26.350 "name": null, 00:17:26.350 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:26.350 "is_configured": false, 00:17:26.350 "data_offset": 2048, 00:17:26.350 "data_size": 63488 00:17:26.350 } 00:17:26.350 ] 00:17:26.350 }' 00:17:26.351 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.351 23:39:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:17:26.930 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:17:26.930 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:26.930 [2024-07-24 23:39:11.898215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:26.930 [2024-07-24 23:39:11.898253] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.930 [2024-07-24 23:39:11.898265] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfab180 00:17:26.930 [2024-07-24 23:39:11.898287] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.930 [2024-07-24 23:39:11.898549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.930 [2024-07-24 23:39:11.898560] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:26.930 [2024-07-24 23:39:11.898605] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:26.930 [2024-07-24 23:39:11.898617] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:26.930 pt2 00:17:26.930 23:39:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:27.189 [2024-07-24 23:39:12.058632] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.189 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:27.448 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.448 "name": "raid_bdev1", 00:17:27.448 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:27.448 "strip_size_kb": 0, 00:17:27.448 "state": "configuring", 00:17:27.448 "raid_level": "raid1", 00:17:27.448 "superblock": true, 00:17:27.448 "num_base_bdevs": 4, 00:17:27.448 "num_base_bdevs_discovered": 1, 00:17:27.448 "num_base_bdevs_operational": 4, 00:17:27.448 "base_bdevs_list": [ 00:17:27.448 { 00:17:27.448 "name": "pt1", 00:17:27.448 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:27.448 "is_configured": true, 00:17:27.448 "data_offset": 2048, 00:17:27.448 "data_size": 63488 00:17:27.448 }, 00:17:27.448 { 00:17:27.448 "name": null, 00:17:27.448 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:27.448 "is_configured": false, 00:17:27.448 "data_offset": 2048, 00:17:27.448 
"data_size": 63488 00:17:27.448 }, 00:17:27.448 { 00:17:27.448 "name": null, 00:17:27.448 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:27.448 "is_configured": false, 00:17:27.448 "data_offset": 2048, 00:17:27.448 "data_size": 63488 00:17:27.448 }, 00:17:27.448 { 00:17:27.448 "name": null, 00:17:27.448 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:27.448 "is_configured": false, 00:17:27.448 "data_offset": 2048, 00:17:27.448 "data_size": 63488 00:17:27.448 } 00:17:27.448 ] 00:17:27.448 }' 00:17:27.448 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.448 23:39:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.015 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:28.016 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:28.016 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:28.016 [2024-07-24 23:39:12.900824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:28.016 [2024-07-24 23:39:12.900865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.016 [2024-07-24 23:39:12.900877] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa85b0 00:17:28.016 [2024-07-24 23:39:12.900885] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.016 [2024-07-24 23:39:12.901154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.016 [2024-07-24 23:39:12.901165] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:28.016 [2024-07-24 23:39:12.901214] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:17:28.016 [2024-07-24 23:39:12.901228] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:28.016 pt2 00:17:28.016 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:28.016 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:28.016 23:39:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:28.275 [2024-07-24 23:39:13.065247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:28.275 [2024-07-24 23:39:13.065277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.275 [2024-07-24 23:39:13.065287] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10741d0 00:17:28.275 [2024-07-24 23:39:13.065294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.275 [2024-07-24 23:39:13.065540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.275 [2024-07-24 23:39:13.065552] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:28.275 [2024-07-24 23:39:13.065595] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:28.275 [2024-07-24 23:39:13.065609] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:28.275 pt3 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:17:28.275 [2024-07-24 23:39:13.237688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:28.275 [2024-07-24 23:39:13.237713] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.275 [2024-07-24 23:39:13.237723] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa7720 00:17:28.275 [2024-07-24 23:39:13.237729] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.275 [2024-07-24 23:39:13.237915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.275 [2024-07-24 23:39:13.237924] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:28.275 [2024-07-24 23:39:13.237956] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:28.275 [2024-07-24 23:39:13.237973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:28.275 [2024-07-24 23:39:13.238054] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfa9610 00:17:28.275 [2024-07-24 23:39:13.238059] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:28.275 [2024-07-24 23:39:13.238169] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfab930 00:17:28.275 [2024-07-24 23:39:13.238254] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfa9610 00:17:28.275 [2024-07-24 23:39:13.238258] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfa9610 00:17:28.275 [2024-07-24 23:39:13.238317] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:28.275 pt4 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:28.275 23:39:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.275 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.537 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.537 "name": "raid_bdev1", 00:17:28.537 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:28.537 "strip_size_kb": 0, 00:17:28.537 "state": "online", 00:17:28.537 "raid_level": "raid1", 00:17:28.537 "superblock": true, 00:17:28.537 "num_base_bdevs": 4, 00:17:28.537 "num_base_bdevs_discovered": 4, 00:17:28.537 "num_base_bdevs_operational": 4, 00:17:28.537 "base_bdevs_list": [ 00:17:28.537 { 00:17:28.537 "name": "pt1", 00:17:28.537 "uuid": "00000000-0000-0000-0000-000000000001", 
00:17:28.537 "is_configured": true, 00:17:28.537 "data_offset": 2048, 00:17:28.537 "data_size": 63488 00:17:28.537 }, 00:17:28.537 { 00:17:28.537 "name": "pt2", 00:17:28.537 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:28.537 "is_configured": true, 00:17:28.537 "data_offset": 2048, 00:17:28.537 "data_size": 63488 00:17:28.537 }, 00:17:28.537 { 00:17:28.537 "name": "pt3", 00:17:28.537 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:28.537 "is_configured": true, 00:17:28.537 "data_offset": 2048, 00:17:28.537 "data_size": 63488 00:17:28.537 }, 00:17:28.537 { 00:17:28.537 "name": "pt4", 00:17:28.537 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:28.537 "is_configured": true, 00:17:28.537 "data_offset": 2048, 00:17:28.537 "data_size": 63488 00:17:28.537 } 00:17:28.537 ] 00:17:28.537 }' 00:17:28.537 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.537 23:39:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:29.106 23:39:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:29.106 [2024-07-24 23:39:14.080057] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:29.106 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:29.106 "name": "raid_bdev1", 00:17:29.106 "aliases": [ 00:17:29.106 "61fa06a6-5f05-4c87-bcf0-4dddfb08decf" 00:17:29.106 ], 00:17:29.106 "product_name": "Raid Volume", 00:17:29.106 "block_size": 512, 00:17:29.106 "num_blocks": 63488, 00:17:29.106 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:29.106 "assigned_rate_limits": { 00:17:29.106 "rw_ios_per_sec": 0, 00:17:29.106 "rw_mbytes_per_sec": 0, 00:17:29.106 "r_mbytes_per_sec": 0, 00:17:29.106 "w_mbytes_per_sec": 0 00:17:29.106 }, 00:17:29.106 "claimed": false, 00:17:29.106 "zoned": false, 00:17:29.106 "supported_io_types": { 00:17:29.106 "read": true, 00:17:29.106 "write": true, 00:17:29.106 "unmap": false, 00:17:29.106 "flush": false, 00:17:29.106 "reset": true, 00:17:29.106 "nvme_admin": false, 00:17:29.106 "nvme_io": false, 00:17:29.106 "nvme_io_md": false, 00:17:29.106 "write_zeroes": true, 00:17:29.106 "zcopy": false, 00:17:29.106 "get_zone_info": false, 00:17:29.106 "zone_management": false, 00:17:29.106 "zone_append": false, 00:17:29.106 "compare": false, 00:17:29.106 "compare_and_write": false, 00:17:29.106 "abort": false, 00:17:29.106 "seek_hole": false, 00:17:29.106 "seek_data": false, 00:17:29.106 "copy": false, 00:17:29.106 "nvme_iov_md": false 00:17:29.106 }, 00:17:29.106 "memory_domains": [ 00:17:29.106 { 00:17:29.106 "dma_device_id": "system", 00:17:29.106 "dma_device_type": 1 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.106 "dma_device_type": 2 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "system", 00:17:29.106 "dma_device_type": 1 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.106 "dma_device_type": 2 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "system", 00:17:29.106 
"dma_device_type": 1 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.106 "dma_device_type": 2 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "system", 00:17:29.106 "dma_device_type": 1 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.106 "dma_device_type": 2 00:17:29.106 } 00:17:29.106 ], 00:17:29.106 "driver_specific": { 00:17:29.106 "raid": { 00:17:29.106 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:29.106 "strip_size_kb": 0, 00:17:29.106 "state": "online", 00:17:29.106 "raid_level": "raid1", 00:17:29.106 "superblock": true, 00:17:29.106 "num_base_bdevs": 4, 00:17:29.106 "num_base_bdevs_discovered": 4, 00:17:29.106 "num_base_bdevs_operational": 4, 00:17:29.106 "base_bdevs_list": [ 00:17:29.106 { 00:17:29.106 "name": "pt1", 00:17:29.106 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:29.106 "is_configured": true, 00:17:29.106 "data_offset": 2048, 00:17:29.106 "data_size": 63488 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "name": "pt2", 00:17:29.106 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:29.106 "is_configured": true, 00:17:29.106 "data_offset": 2048, 00:17:29.106 "data_size": 63488 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "name": "pt3", 00:17:29.106 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:29.106 "is_configured": true, 00:17:29.106 "data_offset": 2048, 00:17:29.106 "data_size": 63488 00:17:29.106 }, 00:17:29.106 { 00:17:29.106 "name": "pt4", 00:17:29.106 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:29.106 "is_configured": true, 00:17:29.106 "data_offset": 2048, 00:17:29.106 "data_size": 63488 00:17:29.106 } 00:17:29.106 ] 00:17:29.106 } 00:17:29.106 } 00:17:29.106 }' 00:17:29.106 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:29.365 23:39:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:29.365 pt2 00:17:29.365 pt3 00:17:29.365 pt4' 00:17:29.365 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.365 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.365 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:29.365 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.365 "name": "pt1", 00:17:29.365 "aliases": [ 00:17:29.365 "00000000-0000-0000-0000-000000000001" 00:17:29.365 ], 00:17:29.365 "product_name": "passthru", 00:17:29.365 "block_size": 512, 00:17:29.365 "num_blocks": 65536, 00:17:29.365 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:29.365 "assigned_rate_limits": { 00:17:29.365 "rw_ios_per_sec": 0, 00:17:29.365 "rw_mbytes_per_sec": 0, 00:17:29.365 "r_mbytes_per_sec": 0, 00:17:29.366 "w_mbytes_per_sec": 0 00:17:29.366 }, 00:17:29.366 "claimed": true, 00:17:29.366 "claim_type": "exclusive_write", 00:17:29.366 "zoned": false, 00:17:29.366 "supported_io_types": { 00:17:29.366 "read": true, 00:17:29.366 "write": true, 00:17:29.366 "unmap": true, 00:17:29.366 "flush": true, 00:17:29.366 "reset": true, 00:17:29.366 "nvme_admin": false, 00:17:29.366 "nvme_io": false, 00:17:29.366 "nvme_io_md": false, 00:17:29.366 "write_zeroes": true, 00:17:29.366 "zcopy": true, 00:17:29.366 "get_zone_info": false, 00:17:29.366 "zone_management": false, 00:17:29.366 "zone_append": false, 00:17:29.366 "compare": false, 00:17:29.366 "compare_and_write": false, 00:17:29.366 "abort": true, 00:17:29.366 "seek_hole": false, 00:17:29.366 "seek_data": false, 00:17:29.366 "copy": true, 00:17:29.366 "nvme_iov_md": false 00:17:29.366 }, 00:17:29.366 "memory_domains": [ 00:17:29.366 { 00:17:29.366 "dma_device_id": "system", 00:17:29.366 
"dma_device_type": 1 00:17:29.366 }, 00:17:29.366 { 00:17:29.366 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.366 "dma_device_type": 2 00:17:29.366 } 00:17:29.366 ], 00:17:29.366 "driver_specific": { 00:17:29.366 "passthru": { 00:17:29.366 "name": "pt1", 00:17:29.366 "base_bdev_name": "malloc1" 00:17:29.366 } 00:17:29.366 } 00:17:29.366 }' 00:17:29.366 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.366 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.625 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:29.883 23:39:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.883 "name": "pt2", 00:17:29.883 "aliases": [ 00:17:29.883 "00000000-0000-0000-0000-000000000002" 00:17:29.883 ], 00:17:29.883 "product_name": "passthru", 00:17:29.883 "block_size": 512, 00:17:29.883 "num_blocks": 65536, 00:17:29.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:29.883 "assigned_rate_limits": { 00:17:29.883 "rw_ios_per_sec": 0, 00:17:29.883 "rw_mbytes_per_sec": 0, 00:17:29.883 "r_mbytes_per_sec": 0, 00:17:29.883 "w_mbytes_per_sec": 0 00:17:29.883 }, 00:17:29.883 "claimed": true, 00:17:29.883 "claim_type": "exclusive_write", 00:17:29.883 "zoned": false, 00:17:29.883 "supported_io_types": { 00:17:29.883 "read": true, 00:17:29.883 "write": true, 00:17:29.883 "unmap": true, 00:17:29.883 "flush": true, 00:17:29.883 "reset": true, 00:17:29.883 "nvme_admin": false, 00:17:29.883 "nvme_io": false, 00:17:29.883 "nvme_io_md": false, 00:17:29.883 "write_zeroes": true, 00:17:29.883 "zcopy": true, 00:17:29.883 "get_zone_info": false, 00:17:29.883 "zone_management": false, 00:17:29.883 "zone_append": false, 00:17:29.883 "compare": false, 00:17:29.883 "compare_and_write": false, 00:17:29.883 "abort": true, 00:17:29.883 "seek_hole": false, 00:17:29.883 "seek_data": false, 00:17:29.883 "copy": true, 00:17:29.883 "nvme_iov_md": false 00:17:29.883 }, 00:17:29.883 "memory_domains": [ 00:17:29.883 { 00:17:29.883 "dma_device_id": "system", 00:17:29.883 "dma_device_type": 1 00:17:29.883 }, 00:17:29.883 { 00:17:29.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.883 "dma_device_type": 2 00:17:29.883 } 00:17:29.883 ], 00:17:29.883 "driver_specific": { 00:17:29.883 "passthru": { 00:17:29.883 "name": "pt2", 00:17:29.883 "base_bdev_name": "malloc2" 00:17:29.883 } 00:17:29.883 } 00:17:29.883 }' 00:17:29.883 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.884 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.884 23:39:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.884 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.884 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.142 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.142 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.142 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.142 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.142 23:39:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.142 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.142 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.142 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.142 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:30.142 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.401 "name": "pt3", 00:17:30.401 "aliases": [ 00:17:30.401 "00000000-0000-0000-0000-000000000003" 00:17:30.401 ], 00:17:30.401 "product_name": "passthru", 00:17:30.401 "block_size": 512, 00:17:30.401 "num_blocks": 65536, 00:17:30.401 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:30.401 "assigned_rate_limits": { 00:17:30.401 "rw_ios_per_sec": 0, 00:17:30.401 "rw_mbytes_per_sec": 0, 00:17:30.401 "r_mbytes_per_sec": 0, 00:17:30.401 "w_mbytes_per_sec": 0 00:17:30.401 }, 00:17:30.401 "claimed": true, 00:17:30.401 
"claim_type": "exclusive_write", 00:17:30.401 "zoned": false, 00:17:30.401 "supported_io_types": { 00:17:30.401 "read": true, 00:17:30.401 "write": true, 00:17:30.401 "unmap": true, 00:17:30.401 "flush": true, 00:17:30.401 "reset": true, 00:17:30.401 "nvme_admin": false, 00:17:30.401 "nvme_io": false, 00:17:30.401 "nvme_io_md": false, 00:17:30.401 "write_zeroes": true, 00:17:30.401 "zcopy": true, 00:17:30.401 "get_zone_info": false, 00:17:30.401 "zone_management": false, 00:17:30.401 "zone_append": false, 00:17:30.401 "compare": false, 00:17:30.401 "compare_and_write": false, 00:17:30.401 "abort": true, 00:17:30.401 "seek_hole": false, 00:17:30.401 "seek_data": false, 00:17:30.401 "copy": true, 00:17:30.401 "nvme_iov_md": false 00:17:30.401 }, 00:17:30.401 "memory_domains": [ 00:17:30.401 { 00:17:30.401 "dma_device_id": "system", 00:17:30.401 "dma_device_type": 1 00:17:30.401 }, 00:17:30.401 { 00:17:30.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.401 "dma_device_type": 2 00:17:30.401 } 00:17:30.401 ], 00:17:30.401 "driver_specific": { 00:17:30.401 "passthru": { 00:17:30.401 "name": "pt3", 00:17:30.401 "base_bdev_name": "malloc3" 00:17:30.401 } 00:17:30.401 } 00:17:30.401 }' 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.401 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.402 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.402 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:30.660 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.918 "name": "pt4", 00:17:30.918 "aliases": [ 00:17:30.918 "00000000-0000-0000-0000-000000000004" 00:17:30.918 ], 00:17:30.918 "product_name": "passthru", 00:17:30.918 "block_size": 512, 00:17:30.918 "num_blocks": 65536, 00:17:30.918 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:30.918 "assigned_rate_limits": { 00:17:30.918 "rw_ios_per_sec": 0, 00:17:30.918 "rw_mbytes_per_sec": 0, 00:17:30.918 "r_mbytes_per_sec": 0, 00:17:30.918 "w_mbytes_per_sec": 0 00:17:30.918 }, 00:17:30.918 "claimed": true, 00:17:30.918 "claim_type": "exclusive_write", 00:17:30.918 "zoned": false, 00:17:30.918 "supported_io_types": { 00:17:30.918 "read": true, 00:17:30.918 "write": true, 00:17:30.918 "unmap": true, 00:17:30.918 "flush": true, 00:17:30.918 "reset": true, 00:17:30.918 "nvme_admin": false, 00:17:30.918 "nvme_io": false, 00:17:30.918 "nvme_io_md": false, 00:17:30.918 "write_zeroes": true, 00:17:30.918 "zcopy": true, 00:17:30.918 "get_zone_info": false, 00:17:30.918 "zone_management": false, 00:17:30.918 "zone_append": false, 00:17:30.918 "compare": false, 00:17:30.918 
"compare_and_write": false, 00:17:30.918 "abort": true, 00:17:30.918 "seek_hole": false, 00:17:30.918 "seek_data": false, 00:17:30.918 "copy": true, 00:17:30.918 "nvme_iov_md": false 00:17:30.918 }, 00:17:30.918 "memory_domains": [ 00:17:30.918 { 00:17:30.918 "dma_device_id": "system", 00:17:30.918 "dma_device_type": 1 00:17:30.918 }, 00:17:30.918 { 00:17:30.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.918 "dma_device_type": 2 00:17:30.918 } 00:17:30.918 ], 00:17:30.918 "driver_specific": { 00:17:30.918 "passthru": { 00:17:30.918 "name": "pt4", 00:17:30.918 "base_bdev_name": "malloc4" 00:17:30.918 } 00:17:30.918 } 00:17:30.918 }' 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.918 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.177 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.177 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.177 23:39:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.177 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.177 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.177 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:31.177 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:31.436 [2024-07-24 23:39:16.185495] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 61fa06a6-5f05-4c87-bcf0-4dddfb08decf '!=' 61fa06a6-5f05-4c87-bcf0-4dddfb08decf ']' 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:31.436 [2024-07-24 23:39:16.365792] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:31.436 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.695 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:31.695 "name": "raid_bdev1", 00:17:31.695 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:31.695 "strip_size_kb": 0, 00:17:31.695 "state": "online", 00:17:31.695 "raid_level": "raid1", 00:17:31.695 "superblock": true, 00:17:31.695 "num_base_bdevs": 4, 00:17:31.695 "num_base_bdevs_discovered": 3, 00:17:31.695 "num_base_bdevs_operational": 3, 00:17:31.695 "base_bdevs_list": [ 00:17:31.695 { 00:17:31.695 "name": null, 00:17:31.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:31.695 "is_configured": false, 00:17:31.695 "data_offset": 2048, 00:17:31.695 "data_size": 63488 00:17:31.695 }, 00:17:31.695 { 00:17:31.695 "name": "pt2", 00:17:31.695 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:31.695 "is_configured": true, 00:17:31.695 "data_offset": 2048, 00:17:31.695 "data_size": 63488 00:17:31.695 }, 00:17:31.695 { 00:17:31.695 "name": "pt3", 00:17:31.695 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:31.695 "is_configured": true, 00:17:31.695 "data_offset": 2048, 00:17:31.695 "data_size": 63488 00:17:31.695 }, 00:17:31.695 { 00:17:31.695 "name": "pt4", 00:17:31.695 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:31.695 "is_configured": true, 00:17:31.695 "data_offset": 2048, 00:17:31.695 "data_size": 63488 00:17:31.695 } 00:17:31.695 ] 00:17:31.695 }' 00:17:31.695 23:39:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:31.695 
23:39:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:32.263 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:32.263 [2024-07-24 23:39:17.163858] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:32.263 [2024-07-24 23:39:17.163877] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:32.263 [2024-07-24 23:39:17.163915] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:32.263 [2024-07-24 23:39:17.163965] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:32.263 [2024-07-24 23:39:17.163971] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfa9610 name raid_bdev1, state offline 00:17:32.263 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.263 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:32.522 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:32.781 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:32.781 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:32.781 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:33.073 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:33.073 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:33.073 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:33.073 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:33.073 23:39:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:33.073 [2024-07-24 23:39:17.989955] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:33.073 [2024-07-24 23:39:17.989986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.073 [2024-07-24 23:39:17.989995] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfaa470 00:17:33.073 [2024-07-24 23:39:17.990001] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.073 [2024-07-24 23:39:17.991184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.073 [2024-07-24 23:39:17.991206] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:17:33.073 [2024-07-24 23:39:17.991252] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:33.073 [2024-07-24 23:39:17.991271] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:33.073 pt2 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.073 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:33.355 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.355 "name": "raid_bdev1", 00:17:33.355 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:33.355 "strip_size_kb": 0, 00:17:33.355 "state": "configuring", 
00:17:33.355 "raid_level": "raid1", 00:17:33.355 "superblock": true, 00:17:33.355 "num_base_bdevs": 4, 00:17:33.355 "num_base_bdevs_discovered": 1, 00:17:33.355 "num_base_bdevs_operational": 3, 00:17:33.355 "base_bdevs_list": [ 00:17:33.355 { 00:17:33.355 "name": null, 00:17:33.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.355 "is_configured": false, 00:17:33.355 "data_offset": 2048, 00:17:33.355 "data_size": 63488 00:17:33.355 }, 00:17:33.355 { 00:17:33.355 "name": "pt2", 00:17:33.355 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:33.355 "is_configured": true, 00:17:33.355 "data_offset": 2048, 00:17:33.355 "data_size": 63488 00:17:33.355 }, 00:17:33.355 { 00:17:33.355 "name": null, 00:17:33.355 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:33.355 "is_configured": false, 00:17:33.355 "data_offset": 2048, 00:17:33.355 "data_size": 63488 00:17:33.355 }, 00:17:33.355 { 00:17:33.355 "name": null, 00:17:33.355 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:33.355 "is_configured": false, 00:17:33.355 "data_offset": 2048, 00:17:33.355 "data_size": 63488 00:17:33.355 } 00:17:33.355 ] 00:17:33.355 }' 00:17:33.355 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.355 23:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:33.923 [2024-07-24 23:39:18.784021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:33.923 [2024-07-24 23:39:18.784062] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.923 [2024-07-24 23:39:18.784074] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb2b30 00:17:33.923 [2024-07-24 23:39:18.784080] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.923 [2024-07-24 23:39:18.784328] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.923 [2024-07-24 23:39:18.784338] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:33.923 [2024-07-24 23:39:18.784381] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:33.923 [2024-07-24 23:39:18.784394] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:33.923 pt3 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.923 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:34.183 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.183 "name": "raid_bdev1", 00:17:34.183 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:34.183 "strip_size_kb": 0, 00:17:34.183 "state": "configuring", 00:17:34.183 "raid_level": "raid1", 00:17:34.183 "superblock": true, 00:17:34.183 "num_base_bdevs": 4, 00:17:34.183 "num_base_bdevs_discovered": 2, 00:17:34.183 "num_base_bdevs_operational": 3, 00:17:34.183 "base_bdevs_list": [ 00:17:34.183 { 00:17:34.183 "name": null, 00:17:34.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.183 "is_configured": false, 00:17:34.183 "data_offset": 2048, 00:17:34.183 "data_size": 63488 00:17:34.183 }, 00:17:34.183 { 00:17:34.183 "name": "pt2", 00:17:34.183 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:34.183 "is_configured": true, 00:17:34.183 "data_offset": 2048, 00:17:34.183 "data_size": 63488 00:17:34.183 }, 00:17:34.183 { 00:17:34.183 "name": "pt3", 00:17:34.183 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:34.183 "is_configured": true, 00:17:34.183 "data_offset": 2048, 00:17:34.183 "data_size": 63488 00:17:34.183 }, 00:17:34.183 { 00:17:34.183 "name": null, 00:17:34.183 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:34.183 "is_configured": false, 00:17:34.183 "data_offset": 2048, 00:17:34.183 "data_size": 63488 00:17:34.183 } 00:17:34.183 ] 00:17:34.183 }' 00:17:34.183 23:39:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.183 23:39:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:34.750 [2024-07-24 23:39:19.590095] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:34.750 [2024-07-24 23:39:19.590131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.750 [2024-07-24 23:39:19.590142] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfab3b0 00:17:34.750 [2024-07-24 23:39:19.590151] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.750 [2024-07-24 23:39:19.590380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.750 [2024-07-24 23:39:19.590389] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:34.750 [2024-07-24 23:39:19.590428] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:34.750 [2024-07-24 23:39:19.590440] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:34.750 [2024-07-24 23:39:19.590537] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfab6d0 00:17:34.750 [2024-07-24 23:39:19.590544] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:34.750 [2024-07-24 23:39:19.590657] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb38f0 00:17:34.750 [2024-07-24 23:39:19.590746] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfab6d0 00:17:34.750 [2024-07-24 23:39:19.590751] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfab6d0 
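The xtrace above repeatedly validates each passthru base bdev with jq (bdev_raid.sh@205-208): `.block_size` must equal 512, while `.md_size`, `.md_interleave`, and `.dif_type` must all be `null` (jq yields `null` for absent keys). A minimal Python sketch of that per-bdev check, using a trimmed, hypothetical copy of the pt4 JSON from the log rather than live `rpc.py bdev_get_bdevs` output:

```python
import json

# Trimmed stand-in for the pt4 passthru bdev dump seen in the log above;
# the metadata keys are intentionally absent, as in the real output.
base_bdev_info = json.loads("""
{
  "name": "pt4",
  "product_name": "passthru",
  "block_size": 512,
  "num_blocks": 65536
}
""")

def check_base_bdev(info, block_size=512):
    # Mirrors the jq checks at bdev_raid.sh@205-208:
    # [[ 512 == 512 ]] for .block_size, and [[ null == null ]] for the
    # metadata fields; dict.get() reproduces jq's missing-key -> null.
    return (info["block_size"] == block_size
            and info.get("md_size") is None
            and info.get("md_interleave") is None
            and info.get("dif_type") is None)

print(check_base_bdev(base_bdev_info))  # True for the pt4 dump above
```

A bdev with DIF/metadata enabled (non-null `md_size` or `dif_type`) would fail this check, which is why the test asserts plain 512-byte-block passthru bdevs before assembling the raid.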
00:17:34.750 [2024-07-24 23:39:19.590817] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:34.750 pt4 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.750 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:35.009 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.009 "name": "raid_bdev1", 00:17:35.009 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:35.009 "strip_size_kb": 0, 00:17:35.009 "state": "online", 00:17:35.009 "raid_level": "raid1", 00:17:35.009 "superblock": true, 00:17:35.009 "num_base_bdevs": 4, 00:17:35.009 "num_base_bdevs_discovered": 3, 00:17:35.009 
"num_base_bdevs_operational": 3, 00:17:35.009 "base_bdevs_list": [ 00:17:35.009 { 00:17:35.009 "name": null, 00:17:35.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.009 "is_configured": false, 00:17:35.009 "data_offset": 2048, 00:17:35.009 "data_size": 63488 00:17:35.009 }, 00:17:35.009 { 00:17:35.009 "name": "pt2", 00:17:35.009 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.009 "is_configured": true, 00:17:35.009 "data_offset": 2048, 00:17:35.009 "data_size": 63488 00:17:35.009 }, 00:17:35.009 { 00:17:35.009 "name": "pt3", 00:17:35.009 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.009 "is_configured": true, 00:17:35.009 "data_offset": 2048, 00:17:35.009 "data_size": 63488 00:17:35.009 }, 00:17:35.009 { 00:17:35.009 "name": "pt4", 00:17:35.009 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:35.009 "is_configured": true, 00:17:35.009 "data_offset": 2048, 00:17:35.009 "data_size": 63488 00:17:35.009 } 00:17:35.009 ] 00:17:35.009 }' 00:17:35.009 23:39:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.009 23:39:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.576 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:35.576 [2024-07-24 23:39:20.436285] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:35.576 [2024-07-24 23:39:20.436304] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:35.576 [2024-07-24 23:39:20.436347] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:35.576 [2024-07-24 23:39:20.436395] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:35.576 [2024-07-24 23:39:20.436401] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0xfab6d0 name raid_bdev1, state offline 00:17:35.576 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.576 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:35.835 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:36.094 [2024-07-24 23:39:20.941577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:36.094 [2024-07-24 23:39:20.941605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:36.094 [2024-07-24 23:39:20.941615] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfa8340 00:17:36.094 [2024-07-24 23:39:20.941622] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:36.094 [2024-07-24 23:39:20.942784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:36.094 [2024-07-24 23:39:20.942803] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:36.094 [2024-07-24 23:39:20.942845] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:17:36.094 [2024-07-24 23:39:20.942863] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:36.094 [2024-07-24 23:39:20.942928] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:36.094 [2024-07-24 23:39:20.942951] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:36.094 [2024-07-24 23:39:20.942959] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfaf380 name raid_bdev1, state configuring 00:17:36.094 [2024-07-24 23:39:20.942973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:36.094 [2024-07-24 23:39:20.943020] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:36.094 pt1 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.094 23:39:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:36.094 23:39:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.353 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.353 "name": "raid_bdev1", 00:17:36.353 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:36.353 "strip_size_kb": 0, 00:17:36.353 "state": "configuring", 00:17:36.353 "raid_level": "raid1", 00:17:36.353 "superblock": true, 00:17:36.353 "num_base_bdevs": 4, 00:17:36.353 "num_base_bdevs_discovered": 2, 00:17:36.353 "num_base_bdevs_operational": 3, 00:17:36.353 "base_bdevs_list": [ 00:17:36.353 { 00:17:36.353 "name": null, 00:17:36.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.353 "is_configured": false, 00:17:36.353 "data_offset": 2048, 00:17:36.353 "data_size": 63488 00:17:36.353 }, 00:17:36.353 { 00:17:36.353 "name": "pt2", 00:17:36.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:36.353 "is_configured": true, 00:17:36.353 "data_offset": 2048, 00:17:36.353 "data_size": 63488 00:17:36.353 }, 00:17:36.353 { 00:17:36.353 "name": "pt3", 00:17:36.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:36.353 "is_configured": true, 00:17:36.353 "data_offset": 2048, 00:17:36.353 "data_size": 63488 00:17:36.353 }, 00:17:36.353 { 00:17:36.353 "name": null, 00:17:36.353 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:36.353 "is_configured": false, 00:17:36.353 "data_offset": 2048, 00:17:36.353 "data_size": 63488 00:17:36.353 } 00:17:36.353 ] 00:17:36.353 }' 00:17:36.353 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.353 23:39:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:17:36.939 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:36.940 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:36.940 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:36.940 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:37.197 [2024-07-24 23:39:21.948184] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:37.198 [2024-07-24 23:39:21.948221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.198 [2024-07-24 23:39:21.948232] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfaf7a0 00:17:37.198 [2024-07-24 23:39:21.948238] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.198 [2024-07-24 23:39:21.948487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.198 [2024-07-24 23:39:21.948498] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:37.198 [2024-07-24 23:39:21.948540] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:37.198 [2024-07-24 23:39:21.948554] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:37.198 [2024-07-24 23:39:21.948634] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfaac50 00:17:37.198 [2024-07-24 23:39:21.948640] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:37.198 [2024-07-24 23:39:21.948757] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xfb0730 00:17:37.198 [2024-07-24 23:39:21.948844] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfaac50 00:17:37.198 [2024-07-24 23:39:21.948849] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfaac50 00:17:37.198 [2024-07-24 23:39:21.948913] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:37.198 pt4 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.198 23:39:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:37.198 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.198 "name": "raid_bdev1", 
00:17:37.198 "uuid": "61fa06a6-5f05-4c87-bcf0-4dddfb08decf", 00:17:37.198 "strip_size_kb": 0, 00:17:37.198 "state": "online", 00:17:37.198 "raid_level": "raid1", 00:17:37.198 "superblock": true, 00:17:37.198 "num_base_bdevs": 4, 00:17:37.198 "num_base_bdevs_discovered": 3, 00:17:37.198 "num_base_bdevs_operational": 3, 00:17:37.198 "base_bdevs_list": [ 00:17:37.198 { 00:17:37.198 "name": null, 00:17:37.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.198 "is_configured": false, 00:17:37.198 "data_offset": 2048, 00:17:37.198 "data_size": 63488 00:17:37.198 }, 00:17:37.198 { 00:17:37.198 "name": "pt2", 00:17:37.198 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:37.198 "is_configured": true, 00:17:37.198 "data_offset": 2048, 00:17:37.198 "data_size": 63488 00:17:37.198 }, 00:17:37.198 { 00:17:37.198 "name": "pt3", 00:17:37.198 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:37.198 "is_configured": true, 00:17:37.198 "data_offset": 2048, 00:17:37.198 "data_size": 63488 00:17:37.198 }, 00:17:37.198 { 00:17:37.198 "name": "pt4", 00:17:37.198 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:37.198 "is_configured": true, 00:17:37.198 "data_offset": 2048, 00:17:37.198 "data_size": 63488 00:17:37.198 } 00:17:37.198 ] 00:17:37.198 }' 00:17:37.198 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.198 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.765 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:37.765 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:38.024 [2024-07-24 23:39:22.954969] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 61fa06a6-5f05-4c87-bcf0-4dddfb08decf '!=' 61fa06a6-5f05-4c87-bcf0-4dddfb08decf ']' 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 339716 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 339716 ']' 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 339716 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:38.024 23:39:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 339716 00:17:38.024 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:38.024 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:38.024 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 339716' 00:17:38.024 killing process with pid 339716 00:17:38.024 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 339716 00:17:38.024 [2024-07-24 23:39:23.010293] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:38.024 [2024-07-24 23:39:23.010340] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:38.024 [2024-07-24 23:39:23.010389] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:17:38.024 [2024-07-24 23:39:23.010396] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfaac50 name raid_bdev1, state offline 00:17:38.024 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 339716 00:17:38.282 [2024-07-24 23:39:23.042858] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:38.282 23:39:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:38.282 00:17:38.282 real 0m19.045s 00:17:38.282 user 0m35.281s 00:17:38.282 sys 0m2.950s 00:17:38.282 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:38.282 23:39:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.282 ************************************ 00:17:38.282 END TEST raid_superblock_test 00:17:38.282 ************************************ 00:17:38.282 23:39:23 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:17:38.282 23:39:23 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:38.282 23:39:23 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:38.282 23:39:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:38.282 ************************************ 00:17:38.282 START TEST raid_read_error_test 00:17:38.282 ************************************ 00:17:38.282 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 read 00:17:38.282 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:38.282 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:38.282 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 
00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:38.541 23:39:23 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.T0522OtSHz 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=343802 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 343802 /var/tmp/spdk-raid.sock 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 343802 ']' 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:38.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:38.541 23:39:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.541 [2024-07-24 23:39:23.345316] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:17:38.541 [2024-07-24 23:39:23.345354] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid343802 ] 00:17:38.541 [2024-07-24 23:39:23.408156] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:38.541 [2024-07-24 23:39:23.486116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:38.541 [2024-07-24 23:39:23.536907] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:38.541 [2024-07-24 23:39:23.536931] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:39.475 23:39:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:39.475 23:39:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:39.476 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:39.476 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:39.476 BaseBdev1_malloc 00:17:39.476 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:39.476 true 00:17:39.476 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:39.734 [2024-07-24 23:39:24.585093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:39.734 [2024-07-24 23:39:24.585126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:39.734 [2024-07-24 23:39:24.585138] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef1550 00:17:39.734 [2024-07-24 23:39:24.585145] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:39.734 [2024-07-24 23:39:24.586383] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:39.734 [2024-07-24 23:39:24.586403] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:39.734 BaseBdev1 00:17:39.734 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:39.734 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:39.993 BaseBdev2_malloc 00:17:39.993 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:39.993 true 00:17:39.993 23:39:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:40.251 [2024-07-24 23:39:25.073918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:40.251 [2024-07-24 23:39:25.073949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.251 [2024-07-24 23:39:25.073959] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef5d90 00:17:40.251 [2024-07-24 23:39:25.073965] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.251 [2024-07-24 23:39:25.075028] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.251 [2024-07-24 23:39:25.075048] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:40.251 BaseBdev2 00:17:40.251 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:40.251 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:40.251 BaseBdev3_malloc 00:17:40.251 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:40.509 true 00:17:40.509 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:40.767 [2024-07-24 23:39:25.546603] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:40.767 [2024-07-24 23:39:25.546633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.767 [2024-07-24 23:39:25.546644] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef8050 00:17:40.767 [2024-07-24 23:39:25.546650] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.767 [2024-07-24 23:39:25.547717] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.767 [2024-07-24 23:39:25.547736] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:40.767 BaseBdev3 00:17:40.767 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:40.767 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4_malloc 00:17:40.767 BaseBdev4_malloc 00:17:40.767 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:41.025 true 00:17:41.025 23:39:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:41.282 [2024-07-24 23:39:26.035278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:41.282 [2024-07-24 23:39:26.035309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:41.282 [2024-07-24 23:39:26.035320] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef8f20 00:17:41.282 [2024-07-24 23:39:26.035326] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:41.282 [2024-07-24 23:39:26.036397] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:41.282 [2024-07-24 23:39:26.036417] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:41.282 BaseBdev4 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:41.282 [2024-07-24 23:39:26.195727] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:41.282 [2024-07-24 23:39:26.196612] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:41.282 [2024-07-24 23:39:26.196660] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.282 [2024-07-24 23:39:26.196700] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:17:41.282 [2024-07-24 23:39:26.196857] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ef30a0 00:17:41.282 [2024-07-24 23:39:26.196863] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:41.282 [2024-07-24 23:39:26.196997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d476e0 00:17:41.282 [2024-07-24 23:39:26.197105] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ef30a0 00:17:41.282 [2024-07-24 23:39:26.197111] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ef30a0 00:17:41.282 [2024-07-24 23:39:26.197181] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.282 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:41.539 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.539 "name": "raid_bdev1", 00:17:41.539 "uuid": "ada43393-d318-47f1-a325-7e6665b45fa5", 00:17:41.539 "strip_size_kb": 0, 00:17:41.539 "state": "online", 00:17:41.539 "raid_level": "raid1", 00:17:41.539 "superblock": true, 00:17:41.539 "num_base_bdevs": 4, 00:17:41.539 "num_base_bdevs_discovered": 4, 00:17:41.540 "num_base_bdevs_operational": 4, 00:17:41.540 "base_bdevs_list": [ 00:17:41.540 { 00:17:41.540 "name": "BaseBdev1", 00:17:41.540 "uuid": "16288452-29a4-5302-981c-379637f4bf7a", 00:17:41.540 "is_configured": true, 00:17:41.540 "data_offset": 2048, 00:17:41.540 "data_size": 63488 00:17:41.540 }, 00:17:41.540 { 00:17:41.540 "name": "BaseBdev2", 00:17:41.540 "uuid": "76bd739d-93d0-58c7-b0e1-d36cc85fa541", 00:17:41.540 "is_configured": true, 00:17:41.540 "data_offset": 2048, 00:17:41.540 "data_size": 63488 00:17:41.540 }, 00:17:41.540 { 00:17:41.540 "name": "BaseBdev3", 00:17:41.540 "uuid": "be620f1a-2d61-52bf-b4c8-e30b947b56b1", 00:17:41.540 "is_configured": true, 00:17:41.540 "data_offset": 2048, 00:17:41.540 "data_size": 63488 00:17:41.540 }, 00:17:41.540 { 00:17:41.540 "name": "BaseBdev4", 00:17:41.540 "uuid": "9895d47b-42fa-51da-b62b-86d23d1afc90", 00:17:41.540 "is_configured": true, 00:17:41.540 "data_offset": 2048, 00:17:41.540 "data_size": 63488 00:17:41.540 } 00:17:41.540 ] 00:17:41.540 }' 00:17:41.540 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.540 23:39:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.105 23:39:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:42.105 23:39:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:42.105 [2024-07-24 23:39:26.941915] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d470e0 00:17:43.040 23:39:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.299 
23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.299 "name": "raid_bdev1", 00:17:43.299 "uuid": "ada43393-d318-47f1-a325-7e6665b45fa5", 00:17:43.299 "strip_size_kb": 0, 00:17:43.299 "state": "online", 00:17:43.299 "raid_level": "raid1", 00:17:43.299 "superblock": true, 00:17:43.299 "num_base_bdevs": 4, 00:17:43.299 "num_base_bdevs_discovered": 4, 00:17:43.299 "num_base_bdevs_operational": 4, 00:17:43.299 "base_bdevs_list": [ 00:17:43.299 { 00:17:43.299 "name": "BaseBdev1", 00:17:43.299 "uuid": "16288452-29a4-5302-981c-379637f4bf7a", 00:17:43.299 "is_configured": true, 00:17:43.299 "data_offset": 2048, 00:17:43.299 "data_size": 63488 00:17:43.299 }, 00:17:43.299 { 00:17:43.299 "name": "BaseBdev2", 00:17:43.299 "uuid": "76bd739d-93d0-58c7-b0e1-d36cc85fa541", 00:17:43.299 "is_configured": true, 00:17:43.299 "data_offset": 2048, 00:17:43.299 "data_size": 63488 00:17:43.299 }, 00:17:43.299 { 00:17:43.299 "name": "BaseBdev3", 00:17:43.299 "uuid": "be620f1a-2d61-52bf-b4c8-e30b947b56b1", 00:17:43.299 "is_configured": true, 00:17:43.299 "data_offset": 2048, 00:17:43.299 "data_size": 63488 00:17:43.299 }, 00:17:43.299 { 00:17:43.299 "name": "BaseBdev4", 00:17:43.299 "uuid": "9895d47b-42fa-51da-b62b-86d23d1afc90", 00:17:43.299 "is_configured": true, 00:17:43.299 "data_offset": 2048, 00:17:43.299 "data_size": 63488 00:17:43.299 } 00:17:43.299 ] 00:17:43.299 }' 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.299 23:39:28 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:43.867 [2024-07-24 23:39:28.842740] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:43.867 [2024-07-24 23:39:28.842773] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:43.867 [2024-07-24 23:39:28.844854] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:43.867 [2024-07-24 23:39:28.844879] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.867 [2024-07-24 23:39:28.844955] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:43.867 [2024-07-24 23:39:28.844961] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ef30a0 name raid_bdev1, state offline 00:17:43.867 0 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 343802 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 343802 ']' 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 343802 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:17:43.867 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 343802 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 343802' 00:17:44.127 killing process with pid 343802 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 343802 00:17:44.127 [2024-07-24 23:39:28.901940] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:44.127 23:39:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 343802 00:17:44.127 [2024-07-24 23:39:28.928610] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.T0522OtSHz 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:44.127 00:17:44.127 real 0m5.833s 00:17:44.127 user 0m9.167s 00:17:44.127 sys 0m0.841s 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:44.127 23:39:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.127 ************************************ 00:17:44.127 END TEST raid_read_error_test 00:17:44.127 ************************************ 00:17:44.386 23:39:29 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:17:44.386 23:39:29 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 
']' 00:17:44.386 23:39:29 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:44.386 23:39:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:44.386 ************************************ 00:17:44.386 START TEST raid_write_error_test 00:17:44.386 ************************************ 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 write 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:44.386 23:39:29 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.JWRaU1znqt 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=344818 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 344818 /var/tmp/spdk-raid.sock 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 344818 ']' 00:17:44.386 23:39:29 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:44.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:44.386 23:39:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.386 [2024-07-24 23:39:29.234154] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:17:44.386 [2024-07-24 23:39:29.234192] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid344818 ] 00:17:44.386 [2024-07-24 23:39:29.297619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:44.386 [2024-07-24 23:39:29.379198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:44.646 [2024-07-24 23:39:29.430142] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:44.646 [2024-07-24 23:39:29.430164] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:45.213 23:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:45.213 23:39:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:45.213 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:45.213 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:45.213 BaseBdev1_malloc 00:17:45.213 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:45.472 true 00:17:45.472 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:45.731 [2024-07-24 23:39:30.506379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:45.731 [2024-07-24 23:39:30.506410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.731 [2024-07-24 23:39:30.506421] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1519550 00:17:45.731 [2024-07-24 23:39:30.506427] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.731 [2024-07-24 23:39:30.507668] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.731 [2024-07-24 23:39:30.507688] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:45.731 BaseBdev1 00:17:45.731 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:45.731 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:45.731 BaseBdev2_malloc 00:17:45.731 23:39:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:45.989 true 00:17:45.990 23:39:30 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:45.990 [2024-07-24 23:39:30.986995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:45.990 [2024-07-24 23:39:30.987037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.990 [2024-07-24 23:39:30.987048] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151dd90 00:17:45.990 [2024-07-24 23:39:30.987054] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.990 [2024-07-24 23:39:30.988148] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.990 [2024-07-24 23:39:30.988168] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:46.248 BaseBdev2 00:17:46.248 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:46.248 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:46.248 BaseBdev3_malloc 00:17:46.248 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:46.507 true 00:17:46.507 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:46.507 [2024-07-24 23:39:31.463588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:46.507 [2024-07-24 23:39:31.463619] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.507 [2024-07-24 23:39:31.463629] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1520050 00:17:46.507 [2024-07-24 23:39:31.463635] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.507 [2024-07-24 23:39:31.464703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.507 [2024-07-24 23:39:31.464723] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:46.507 BaseBdev3 00:17:46.507 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:46.507 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:46.766 BaseBdev4_malloc 00:17:46.766 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:47.025 true 00:17:47.025 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:47.025 [2024-07-24 23:39:31.944313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:47.025 [2024-07-24 23:39:31.944341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:47.025 [2024-07-24 23:39:31.944353] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1520f20 00:17:47.025 [2024-07-24 23:39:31.944358] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:47.025 [2024-07-24 23:39:31.945451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:17:47.025 [2024-07-24 23:39:31.945478] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:47.025 BaseBdev4 00:17:47.025 23:39:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:47.284 [2024-07-24 23:39:32.100738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:47.284 [2024-07-24 23:39:32.101610] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:47.284 [2024-07-24 23:39:32.101657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:47.284 [2024-07-24 23:39:32.101694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:47.284 [2024-07-24 23:39:32.101853] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x151b0a0 00:17:47.284 [2024-07-24 23:39:32.101860] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:47.284 [2024-07-24 23:39:32.101989] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x136f6e0 00:17:47.284 [2024-07-24 23:39:32.102095] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151b0a0 00:17:47.284 [2024-07-24 23:39:32.102099] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151b0a0 00:17:47.284 [2024-07-24 23:39:32.102165] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:47.284 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.284 "name": "raid_bdev1", 00:17:47.284 "uuid": "5fdcd72f-1852-4258-9595-3f9da75fa41f", 00:17:47.284 "strip_size_kb": 0, 00:17:47.284 "state": "online", 00:17:47.284 "raid_level": "raid1", 00:17:47.284 "superblock": true, 00:17:47.284 "num_base_bdevs": 4, 00:17:47.284 "num_base_bdevs_discovered": 4, 00:17:47.284 "num_base_bdevs_operational": 4, 00:17:47.284 "base_bdevs_list": [ 00:17:47.284 { 00:17:47.284 "name": "BaseBdev1", 00:17:47.284 "uuid": "46452cd2-168a-5f28-b9fb-b1494961fe35", 00:17:47.284 "is_configured": true, 00:17:47.284 "data_offset": 2048, 00:17:47.284 "data_size": 63488 00:17:47.284 }, 00:17:47.284 { 00:17:47.284 "name": "BaseBdev2", 00:17:47.284 "uuid": "a014dec5-8e6a-5dc4-b143-24fdff738937", 00:17:47.284 "is_configured": true, 
00:17:47.284 "data_offset": 2048, 00:17:47.284 "data_size": 63488 00:17:47.284 }, 00:17:47.284 { 00:17:47.284 "name": "BaseBdev3", 00:17:47.284 "uuid": "c13b77a6-32ca-5326-9c2d-313fbb871657", 00:17:47.284 "is_configured": true, 00:17:47.284 "data_offset": 2048, 00:17:47.284 "data_size": 63488 00:17:47.284 }, 00:17:47.284 { 00:17:47.284 "name": "BaseBdev4", 00:17:47.284 "uuid": "effc74ab-6aea-5e67-9b15-0edf65fa349b", 00:17:47.284 "is_configured": true, 00:17:47.284 "data_offset": 2048, 00:17:47.284 "data_size": 63488 00:17:47.284 } 00:17:47.285 ] 00:17:47.285 }' 00:17:47.285 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.285 23:39:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.852 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:47.852 23:39:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:47.852 [2024-07-24 23:39:32.814787] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x136f0e0 00:17:48.788 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:49.047 [2024-07-24 23:39:33.893911] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:49.047 [2024-07-24 23:39:33.893952] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:49.047 [2024-07-24 23:39:33.894132] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x136f0e0 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.047 23:39:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:49.306 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.306 "name": "raid_bdev1", 00:17:49.306 "uuid": "5fdcd72f-1852-4258-9595-3f9da75fa41f", 00:17:49.306 "strip_size_kb": 0, 00:17:49.306 "state": "online", 00:17:49.306 "raid_level": 
"raid1", 00:17:49.306 "superblock": true, 00:17:49.306 "num_base_bdevs": 4, 00:17:49.306 "num_base_bdevs_discovered": 3, 00:17:49.306 "num_base_bdevs_operational": 3, 00:17:49.306 "base_bdevs_list": [ 00:17:49.306 { 00:17:49.306 "name": null, 00:17:49.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:49.306 "is_configured": false, 00:17:49.306 "data_offset": 2048, 00:17:49.306 "data_size": 63488 00:17:49.306 }, 00:17:49.306 { 00:17:49.306 "name": "BaseBdev2", 00:17:49.306 "uuid": "a014dec5-8e6a-5dc4-b143-24fdff738937", 00:17:49.306 "is_configured": true, 00:17:49.306 "data_offset": 2048, 00:17:49.306 "data_size": 63488 00:17:49.306 }, 00:17:49.306 { 00:17:49.306 "name": "BaseBdev3", 00:17:49.306 "uuid": "c13b77a6-32ca-5326-9c2d-313fbb871657", 00:17:49.306 "is_configured": true, 00:17:49.306 "data_offset": 2048, 00:17:49.306 "data_size": 63488 00:17:49.306 }, 00:17:49.306 { 00:17:49.306 "name": "BaseBdev4", 00:17:49.306 "uuid": "effc74ab-6aea-5e67-9b15-0edf65fa349b", 00:17:49.306 "is_configured": true, 00:17:49.306 "data_offset": 2048, 00:17:49.306 "data_size": 63488 00:17:49.306 } 00:17:49.306 ] 00:17:49.306 }' 00:17:49.306 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.306 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.564 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:49.824 [2024-07-24 23:39:34.686628] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:49.824 [2024-07-24 23:39:34.686657] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:49.824 [2024-07-24 23:39:34.688603] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:49.824 [2024-07-24 23:39:34.688626] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:17:49.824 [2024-07-24 23:39:34.688687] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:49.824 [2024-07-24 23:39:34.688692] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151b0a0 name raid_bdev1, state offline 00:17:49.824 0 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 344818 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 344818 ']' 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 344818 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 344818 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 344818' 00:17:49.824 killing process with pid 344818 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 344818 00:17:49.824 [2024-07-24 23:39:34.743875] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:49.824 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 344818 00:17:49.824 [2024-07-24 23:39:34.771339] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.JWRaU1znqt 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:50.083 00:17:50.083 real 0m5.782s 00:17:50.083 user 0m9.108s 00:17:50.083 sys 0m0.782s 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:50.083 23:39:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.083 ************************************ 00:17:50.083 END TEST raid_write_error_test 00:17:50.083 ************************************ 00:17:50.083 23:39:34 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:17:50.083 23:39:34 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:17:50.083 23:39:34 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:17:50.083 23:39:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:17:50.083 23:39:34 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:50.083 23:39:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:50.083 ************************************ 00:17:50.083 START TEST raid_rebuild_test 00:17:50.083 ************************************ 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false false true 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # 
local raid_level=raid1 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=345894 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 345894 /var/tmp/spdk-raid.sock 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 345894 ']' 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:50.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:50.083 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.083 [2024-07-24 23:39:35.068258] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:17:50.083 [2024-07-24 23:39:35.068296] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid345894 ] 00:17:50.083 I/O size of 3145728 is greater than zero copy threshold (65536). 00:17:50.083 Zero copy mechanism will not be used. 00:17:50.341 [2024-07-24 23:39:35.126064] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:50.341 [2024-07-24 23:39:35.205329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:50.341 [2024-07-24 23:39:35.262336] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.341 [2024-07-24 23:39:35.262365] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.908 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:50.908 23:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:17:50.908 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:17:50.908 23:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:51.166 BaseBdev1_malloc 00:17:51.166 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:17:51.425 [2024-07-24 23:39:36.174225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:17:51.425 [2024-07-24 23:39:36.174259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.425 [2024-07-24 23:39:36.174273] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x1c601c0 00:17:51.425 [2024-07-24 23:39:36.174279] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.425 [2024-07-24 23:39:36.175438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.425 [2024-07-24 23:39:36.175459] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:51.425 BaseBdev1 00:17:51.425 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:17:51.425 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:51.425 BaseBdev2_malloc 00:17:51.425 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:17:51.685 [2024-07-24 23:39:36.498655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:17:51.685 [2024-07-24 23:39:36.498688] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.685 [2024-07-24 23:39:36.498703] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c60ce0 00:17:51.685 [2024-07-24 23:39:36.498725] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.685 [2024-07-24 23:39:36.499783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.685 [2024-07-24 23:39:36.499803] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:51.685 BaseBdev2 00:17:51.685 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:17:51.685 spare_malloc 00:17:51.685 
23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:17:51.943 spare_delay 00:17:51.943 23:39:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:17:52.202 [2024-07-24 23:39:36.987388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:17:52.202 [2024-07-24 23:39:36.987419] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:52.202 [2024-07-24 23:39:36.987431] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0f340 00:17:52.202 [2024-07-24 23:39:36.987436] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.202 [2024-07-24 23:39:36.988519] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.202 [2024-07-24 23:39:36.988543] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:17:52.202 spare 00:17:52.202 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:17:52.203 [2024-07-24 23:39:37.143905] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:52.203 [2024-07-24 23:39:37.144759] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:52.203 [2024-07-24 23:39:37.144813] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e104f0 00:17:52.203 [2024-07-24 23:39:37.144819] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:52.203 [2024-07-24 23:39:37.144957] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e09910 00:17:52.203 [2024-07-24 23:39:37.145051] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e104f0 00:17:52.203 [2024-07-24 23:39:37.145056] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e104f0 00:17:52.203 [2024-07-24 23:39:37.145128] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.203 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:52.477 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.477 "name": "raid_bdev1", 
00:17:52.477 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:17:52.477 "strip_size_kb": 0, 00:17:52.477 "state": "online", 00:17:52.477 "raid_level": "raid1", 00:17:52.477 "superblock": false, 00:17:52.477 "num_base_bdevs": 2, 00:17:52.477 "num_base_bdevs_discovered": 2, 00:17:52.477 "num_base_bdevs_operational": 2, 00:17:52.477 "base_bdevs_list": [ 00:17:52.477 { 00:17:52.477 "name": "BaseBdev1", 00:17:52.477 "uuid": "11c90516-052b-53f1-afc2-b45bc4a9b363", 00:17:52.477 "is_configured": true, 00:17:52.477 "data_offset": 0, 00:17:52.477 "data_size": 65536 00:17:52.477 }, 00:17:52.477 { 00:17:52.477 "name": "BaseBdev2", 00:17:52.477 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:17:52.477 "is_configured": true, 00:17:52.477 "data_offset": 0, 00:17:52.477 "data_size": 65536 00:17:52.477 } 00:17:52.477 ] 00:17:52.477 }' 00:17:52.477 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.477 23:39:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.129 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:53.129 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:17:53.129 [2024-07-24 23:39:37.962154] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:53.129 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:17:53.129 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.129 23:39:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:17:53.388 23:39:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:17:53.388 [2024-07-24 23:39:38.286863] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e09910 00:17:53.388 /dev/nbd0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:17:53.388 23:39:38 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:17:53.388 1+0 records in 00:17:53.388 1+0 records out 00:17:53.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000195607 s, 20.9 MB/s 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:17:53.388 23:39:38 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:17:57.578 65536+0 records in 00:17:57.578 65536+0 records out 00:17:57.578 33554432 bytes (34 MB, 32 MiB) copied, 3.52821 s, 9.5 MB/s 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:17:57.578 23:39:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:17:57.578 [2024-07-24 23:39:42.068220] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:17:57.578 [2024-07-24 23:39:42.232670] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.578 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.579 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.579 "name": "raid_bdev1", 00:17:57.579 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:17:57.579 "strip_size_kb": 0, 00:17:57.579 "state": "online", 00:17:57.579 "raid_level": "raid1", 00:17:57.579 "superblock": false, 00:17:57.579 
"num_base_bdevs": 2, 00:17:57.579 "num_base_bdevs_discovered": 1, 00:17:57.579 "num_base_bdevs_operational": 1, 00:17:57.579 "base_bdevs_list": [ 00:17:57.579 { 00:17:57.579 "name": null, 00:17:57.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.579 "is_configured": false, 00:17:57.579 "data_offset": 0, 00:17:57.579 "data_size": 65536 00:17:57.579 }, 00:17:57.579 { 00:17:57.579 "name": "BaseBdev2", 00:17:57.579 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:17:57.579 "is_configured": true, 00:17:57.579 "data_offset": 0, 00:17:57.579 "data_size": 65536 00:17:57.579 } 00:17:57.579 ] 00:17:57.579 }' 00:17:57.579 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.579 23:39:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.147 23:39:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:17:58.147 [2024-07-24 23:39:43.062829] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:17:58.147 [2024-07-24 23:39:43.067155] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e09910 00:17:58.147 [2024-07-24 23:39:43.068566] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:17:58.147 23:39:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:17:59.523 "name": "raid_bdev1", 00:17:59.523 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:17:59.523 "strip_size_kb": 0, 00:17:59.523 "state": "online", 00:17:59.523 "raid_level": "raid1", 00:17:59.523 "superblock": false, 00:17:59.523 "num_base_bdevs": 2, 00:17:59.523 "num_base_bdevs_discovered": 2, 00:17:59.523 "num_base_bdevs_operational": 2, 00:17:59.523 "process": { 00:17:59.523 "type": "rebuild", 00:17:59.523 "target": "spare", 00:17:59.523 "progress": { 00:17:59.523 "blocks": 22528, 00:17:59.523 "percent": 34 00:17:59.523 } 00:17:59.523 }, 00:17:59.523 "base_bdevs_list": [ 00:17:59.523 { 00:17:59.523 "name": "spare", 00:17:59.523 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:17:59.523 "is_configured": true, 00:17:59.523 "data_offset": 0, 00:17:59.523 "data_size": 65536 00:17:59.523 }, 00:17:59.523 { 00:17:59.523 "name": "BaseBdev2", 00:17:59.523 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:17:59.523 "is_configured": true, 00:17:59.523 "data_offset": 0, 00:17:59.523 "data_size": 65536 00:17:59.523 } 00:17:59.523 ] 00:17:59.523 }' 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:17:59.523 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:17:59.523 
23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:17:59.523 [2024-07-24 23:39:44.499647] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:17:59.781 [2024-07-24 23:39:44.579119] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:17:59.781 [2024-07-24 23:39:44.579152] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:59.781 [2024-07-24 23:39:44.579160] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:17:59.781 [2024-07-24 23:39:44.579164] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:17:59.781 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.782 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.040 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.040 "name": "raid_bdev1", 00:18:00.040 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:00.040 "strip_size_kb": 0, 00:18:00.040 "state": "online", 00:18:00.040 "raid_level": "raid1", 00:18:00.040 "superblock": false, 00:18:00.040 "num_base_bdevs": 2, 00:18:00.040 "num_base_bdevs_discovered": 1, 00:18:00.040 "num_base_bdevs_operational": 1, 00:18:00.040 "base_bdevs_list": [ 00:18:00.040 { 00:18:00.040 "name": null, 00:18:00.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.040 "is_configured": false, 00:18:00.040 "data_offset": 0, 00:18:00.040 "data_size": 65536 00:18:00.040 }, 00:18:00.040 { 00:18:00.040 "name": "BaseBdev2", 00:18:00.040 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:00.040 "is_configured": true, 00:18:00.040 "data_offset": 0, 00:18:00.040 "data_size": 65536 00:18:00.040 } 00:18:00.040 ] 00:18:00.040 }' 00:18:00.040 23:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.040 23:39:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.298 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:00.557 "name": "raid_bdev1", 00:18:00.557 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:00.557 "strip_size_kb": 0, 00:18:00.557 "state": "online", 00:18:00.557 "raid_level": "raid1", 00:18:00.557 "superblock": false, 00:18:00.557 "num_base_bdevs": 2, 00:18:00.557 "num_base_bdevs_discovered": 1, 00:18:00.557 "num_base_bdevs_operational": 1, 00:18:00.557 "base_bdevs_list": [ 00:18:00.557 { 00:18:00.557 "name": null, 00:18:00.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.557 "is_configured": false, 00:18:00.557 "data_offset": 0, 00:18:00.557 "data_size": 65536 00:18:00.557 }, 00:18:00.557 { 00:18:00.557 "name": "BaseBdev2", 00:18:00.557 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:00.557 "is_configured": true, 00:18:00.557 "data_offset": 0, 00:18:00.557 "data_size": 65536 00:18:00.557 } 00:18:00.557 ] 00:18:00.557 }' 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:00.557 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:00.815 [2024-07-24 23:39:45.682035] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 
00:18:00.815 [2024-07-24 23:39:45.686377] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e09910 00:18:00.815 [2024-07-24 23:39:45.687444] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:00.815 23:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.751 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:02.011 "name": "raid_bdev1", 00:18:02.011 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:02.011 "strip_size_kb": 0, 00:18:02.011 "state": "online", 00:18:02.011 "raid_level": "raid1", 00:18:02.011 "superblock": false, 00:18:02.011 "num_base_bdevs": 2, 00:18:02.011 "num_base_bdevs_discovered": 2, 00:18:02.011 "num_base_bdevs_operational": 2, 00:18:02.011 "process": { 00:18:02.011 "type": "rebuild", 00:18:02.011 "target": "spare", 00:18:02.011 "progress": { 00:18:02.011 "blocks": 22528, 00:18:02.011 "percent": 34 00:18:02.011 } 00:18:02.011 }, 00:18:02.011 "base_bdevs_list": [ 00:18:02.011 { 00:18:02.011 "name": "spare", 00:18:02.011 "uuid": 
"f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:02.011 "is_configured": true, 00:18:02.011 "data_offset": 0, 00:18:02.011 "data_size": 65536 00:18:02.011 }, 00:18:02.011 { 00:18:02.011 "name": "BaseBdev2", 00:18:02.011 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:02.011 "is_configured": true, 00:18:02.011 "data_offset": 0, 00:18:02.011 "data_size": 65536 00:18:02.011 } 00:18:02.011 ] 00:18:02.011 }' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=579 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.011 23:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:02.269 23:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:02.269 "name": "raid_bdev1", 00:18:02.269 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:02.269 "strip_size_kb": 0, 00:18:02.269 "state": "online", 00:18:02.269 "raid_level": "raid1", 00:18:02.269 "superblock": false, 00:18:02.269 "num_base_bdevs": 2, 00:18:02.269 "num_base_bdevs_discovered": 2, 00:18:02.269 "num_base_bdevs_operational": 2, 00:18:02.269 "process": { 00:18:02.269 "type": "rebuild", 00:18:02.269 "target": "spare", 00:18:02.269 "progress": { 00:18:02.269 "blocks": 28672, 00:18:02.269 "percent": 43 00:18:02.269 } 00:18:02.269 }, 00:18:02.269 "base_bdevs_list": [ 00:18:02.269 { 00:18:02.269 "name": "spare", 00:18:02.269 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:02.269 "is_configured": true, 00:18:02.269 "data_offset": 0, 00:18:02.269 "data_size": 65536 00:18:02.269 }, 00:18:02.269 { 00:18:02.269 "name": "BaseBdev2", 00:18:02.269 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:02.269 "is_configured": true, 00:18:02.269 "data_offset": 0, 00:18:02.269 "data_size": 65536 00:18:02.269 } 00:18:02.269 ] 00:18:02.269 }' 00:18:02.269 23:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:02.269 23:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:02.269 23:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:02.269 23:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:02.269 23:39:47 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.268 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:03.527 "name": "raid_bdev1", 00:18:03.527 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:03.527 "strip_size_kb": 0, 00:18:03.527 "state": "online", 00:18:03.527 "raid_level": "raid1", 00:18:03.527 "superblock": false, 00:18:03.527 "num_base_bdevs": 2, 00:18:03.527 "num_base_bdevs_discovered": 2, 00:18:03.527 "num_base_bdevs_operational": 2, 00:18:03.527 "process": { 00:18:03.527 "type": "rebuild", 00:18:03.527 "target": "spare", 00:18:03.527 "progress": { 00:18:03.527 "blocks": 53248, 00:18:03.527 "percent": 81 00:18:03.527 } 00:18:03.527 }, 00:18:03.527 "base_bdevs_list": [ 00:18:03.527 { 00:18:03.527 "name": "spare", 00:18:03.527 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:03.527 "is_configured": true, 00:18:03.527 "data_offset": 0, 00:18:03.527 "data_size": 65536 00:18:03.527 }, 00:18:03.527 { 00:18:03.527 "name": "BaseBdev2", 
00:18:03.527 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:03.527 "is_configured": true, 00:18:03.527 "data_offset": 0, 00:18:03.527 "data_size": 65536 00:18:03.527 } 00:18:03.527 ] 00:18:03.527 }' 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:03.527 23:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:04.093 [2024-07-24 23:39:48.909587] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:04.093 [2024-07-24 23:39:48.909627] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:04.093 [2024-07-24 23:39:48.909652] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.660 23:39:49 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:04.660 "name": "raid_bdev1", 00:18:04.660 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:04.660 "strip_size_kb": 0, 00:18:04.660 "state": "online", 00:18:04.660 "raid_level": "raid1", 00:18:04.660 "superblock": false, 00:18:04.660 "num_base_bdevs": 2, 00:18:04.660 "num_base_bdevs_discovered": 2, 00:18:04.660 "num_base_bdevs_operational": 2, 00:18:04.660 "base_bdevs_list": [ 00:18:04.660 { 00:18:04.660 "name": "spare", 00:18:04.660 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:04.660 "is_configured": true, 00:18:04.660 "data_offset": 0, 00:18:04.660 "data_size": 65536 00:18:04.660 }, 00:18:04.660 { 00:18:04.660 "name": "BaseBdev2", 00:18:04.660 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:04.660 "is_configured": true, 00:18:04.660 "data_offset": 0, 00:18:04.660 "data_size": 65536 00:18:04.660 } 00:18:04.660 ] 00:18:04.660 }' 00:18:04.660 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:04.918 "name": "raid_bdev1", 00:18:04.918 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:04.918 "strip_size_kb": 0, 00:18:04.918 "state": "online", 00:18:04.918 "raid_level": "raid1", 00:18:04.918 "superblock": false, 00:18:04.918 "num_base_bdevs": 2, 00:18:04.918 "num_base_bdevs_discovered": 2, 00:18:04.918 "num_base_bdevs_operational": 2, 00:18:04.918 "base_bdevs_list": [ 00:18:04.918 { 00:18:04.918 "name": "spare", 00:18:04.918 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:04.918 "is_configured": true, 00:18:04.918 "data_offset": 0, 00:18:04.918 "data_size": 65536 00:18:04.918 }, 00:18:04.918 { 00:18:04.918 "name": "BaseBdev2", 00:18:04.918 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:04.918 "is_configured": true, 00:18:04.918 "data_offset": 0, 00:18:04.918 "data_size": 65536 00:18:04.918 } 00:18:04.918 ] 00:18:04.918 }' 00:18:04.918 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.177 23:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.177 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.177 "name": "raid_bdev1", 00:18:05.177 "uuid": "091c0852-0fb0-464d-95fd-5e3adab11048", 00:18:05.177 "strip_size_kb": 0, 00:18:05.177 "state": "online", 00:18:05.177 "raid_level": "raid1", 00:18:05.177 "superblock": false, 00:18:05.177 "num_base_bdevs": 2, 00:18:05.177 "num_base_bdevs_discovered": 2, 00:18:05.177 "num_base_bdevs_operational": 2, 00:18:05.177 "base_bdevs_list": [ 00:18:05.177 { 00:18:05.177 "name": "spare", 00:18:05.177 "uuid": "f48c5c7c-ac11-5425-9f61-f718a58a5939", 00:18:05.177 "is_configured": true, 00:18:05.177 "data_offset": 0, 00:18:05.177 "data_size": 65536 00:18:05.177 }, 00:18:05.177 { 00:18:05.177 
"name": "BaseBdev2", 00:18:05.177 "uuid": "79d8f398-c740-5f8d-856f-293051b9d5d6", 00:18:05.177 "is_configured": true, 00:18:05.177 "data_offset": 0, 00:18:05.177 "data_size": 65536 00:18:05.177 } 00:18:05.177 ] 00:18:05.177 }' 00:18:05.177 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.177 23:39:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:05.743 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:06.002 [2024-07-24 23:39:50.802199] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:06.002 [2024-07-24 23:39:50.802217] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:06.002 [2024-07-24 23:39:50.802258] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:06.002 [2024-07-24 23:39:50.802296] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:06.002 [2024-07-24 23:39:50.802302] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e104f0 name raid_bdev1, state offline 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:06.002 23:39:50 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:06.260 /dev/nbd0 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:06.260 23:39:51 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:06.260 1+0 records in 00:18:06.260 1+0 records out 00:18:06.260 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189081 s, 21.7 MB/s 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:06.260 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:06.519 /dev/nbd1 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:06.519 1+0 records in 00:18:06.519 1+0 records out 00:18:06.519 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209431 s, 19.6 MB/s 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:06.519 
23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:06.519 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:06.778 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:07.037 23:39:51 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 345894 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 345894 ']' 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 345894 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 345894 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 345894' 00:18:07.037 killing process with pid 345894 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 345894 00:18:07.037 Received shutdown signal, test time was about 60.000000 seconds 00:18:07.037 00:18:07.037 Latency(us) 00:18:07.037 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:07.037 
=================================================================================================================== 00:18:07.037 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:07.037 [2024-07-24 23:39:51.883903] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:07.037 23:39:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 345894 00:18:07.037 [2024-07-24 23:39:51.907249] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:18:07.296 00:18:07.296 real 0m17.053s 00:18:07.296 user 0m23.317s 00:18:07.296 sys 0m2.872s 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.296 ************************************ 00:18:07.296 END TEST raid_rebuild_test 00:18:07.296 ************************************ 00:18:07.296 23:39:52 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:18:07.296 23:39:52 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:18:07.296 23:39:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:07.296 23:39:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:07.296 ************************************ 00:18:07.296 START TEST raid_rebuild_test_sb 00:18:07.296 ************************************ 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local 
superblock=true 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:07.296 23:39:52 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=349020 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 349020 /var/tmp/spdk-raid.sock 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 349020 ']' 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:07.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:07.296 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.296 [2024-07-24 23:39:52.202163] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:18:07.296 [2024-07-24 23:39:52.202200] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid349020 ] 00:18:07.296 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:07.296 Zero copy mechanism will not be used. 00:18:07.296 [2024-07-24 23:39:52.266686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:07.555 [2024-07-24 23:39:52.345127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:07.555 [2024-07-24 23:39:52.396195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:07.555 [2024-07-24 23:39:52.396225] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.123 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:08.123 23:39:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:18:08.123 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:08.123 23:39:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:08.381 BaseBdev1_malloc 00:18:08.381 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:08.381 [2024-07-24 23:39:53.331117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:08.381 [2024-07-24 23:39:53.331150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.381 [2024-07-24 23:39:53.331163] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x24991c0 00:18:08.381 [2024-07-24 23:39:53.331169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.381 [2024-07-24 23:39:53.332312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.381 [2024-07-24 23:39:53.332332] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:08.381 BaseBdev1 00:18:08.381 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:08.381 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:08.639 BaseBdev2_malloc 00:18:08.639 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:08.897 [2024-07-24 23:39:53.667631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:08.897 [2024-07-24 23:39:53.667662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.897 [2024-07-24 23:39:53.667675] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2499ce0 00:18:08.897 [2024-07-24 23:39:53.667697] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.897 [2024-07-24 23:39:53.668748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.897 [2024-07-24 23:39:53.668768] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:08.897 BaseBdev2 00:18:08.897 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:18:08.897 spare_malloc 00:18:08.897 23:39:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:09.155 spare_delay 00:18:09.155 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:09.414 [2024-07-24 23:39:54.160305] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:09.414 [2024-07-24 23:39:54.160334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:09.414 [2024-07-24 23:39:54.160345] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2648340 00:18:09.414 [2024-07-24 23:39:54.160351] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:09.414 [2024-07-24 23:39:54.161374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:09.414 [2024-07-24 23:39:54.161394] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:09.414 spare 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:09.414 [2024-07-24 23:39:54.312723] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.414 [2024-07-24 23:39:54.313559] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:09.414 [2024-07-24 23:39:54.313671] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26494f0 00:18:09.414 [2024-07-24 23:39:54.313679] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 
512 00:18:09.414 [2024-07-24 23:39:54.313806] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642910 00:18:09.414 [2024-07-24 23:39:54.313901] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26494f0 00:18:09.414 [2024-07-24 23:39:54.313906] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26494f0 00:18:09.414 [2024-07-24 23:39:54.313968] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.414 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:09.672 23:39:54 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.672 "name": "raid_bdev1", 00:18:09.672 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:09.672 "strip_size_kb": 0, 00:18:09.672 "state": "online", 00:18:09.672 "raid_level": "raid1", 00:18:09.672 "superblock": true, 00:18:09.672 "num_base_bdevs": 2, 00:18:09.672 "num_base_bdevs_discovered": 2, 00:18:09.672 "num_base_bdevs_operational": 2, 00:18:09.672 "base_bdevs_list": [ 00:18:09.672 { 00:18:09.672 "name": "BaseBdev1", 00:18:09.672 "uuid": "0b5cd56b-410b-59ef-92e9-9d0b418eec7f", 00:18:09.672 "is_configured": true, 00:18:09.672 "data_offset": 2048, 00:18:09.672 "data_size": 63488 00:18:09.672 }, 00:18:09.672 { 00:18:09.672 "name": "BaseBdev2", 00:18:09.672 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:09.672 "is_configured": true, 00:18:09.672 "data_offset": 2048, 00:18:09.672 "data_size": 63488 00:18:09.672 } 00:18:09.672 ] 00:18:09.672 }' 00:18:09.672 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.672 23:39:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.239 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:10.239 23:39:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:10.239 [2024-07-24 23:39:55.110981] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:10.239 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:18:10.239 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.239 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:10.499 23:39:55 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']'
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:18:10.499 [2024-07-24 23:39:55.459753] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642910
00:18:10.499 /dev/nbd0
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:18:10.499 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:18:10.757 1+0 records in
00:18:10.757 1+0 records out
00:18:10.757 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022665 s, 18.1 MB/s
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:18:10.757 23:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
00:18:14.042 63488+0 records in
00:18:14.042 63488+0 records out
00:18:14.042 32505856 bytes (33 MB, 31 MiB) copied, 3.13932 s, 10.4 MB/s
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:18:14.042 [2024-07-24 23:39:58.841050] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:18:14.042 23:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:18:14.042 [2024-07-24 23:39:58.996561] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:14.043 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:14.300 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:14.300 "name": "raid_bdev1",
00:18:14.300 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:14.300 "strip_size_kb": 0,
00:18:14.300 "state": "online",
00:18:14.300 "raid_level": "raid1",
00:18:14.300 "superblock": true,
00:18:14.300 "num_base_bdevs": 2,
00:18:14.300 "num_base_bdevs_discovered": 1,
00:18:14.300 "num_base_bdevs_operational": 1,
00:18:14.300 "base_bdevs_list": [
00:18:14.300 {
00:18:14.300 "name": null,
00:18:14.300 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:14.300 "is_configured": false,
00:18:14.300 "data_offset": 2048,
00:18:14.300 "data_size": 63488
00:18:14.300 },
00:18:14.300 {
00:18:14.300 "name": "BaseBdev2",
00:18:14.300 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:14.300 "is_configured": true,
00:18:14.300 "data_offset": 2048,
00:18:14.300 "data_size": 63488
00:18:14.300 }
00:18:14.300 ]
00:18:14.300 }'
00:18:14.301 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:14.301 23:39:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:14.867 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:18:14.867 [2024-07-24 23:39:59.842752] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:18:14.867 [2024-07-24 23:39:59.847088] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642910
00:18:14.867 [2024-07-24 23:39:59.848527] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:18:14.867 23:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1
00:18:16.331 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:16.332 23:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:16.332 "name": "raid_bdev1",
00:18:16.332 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:16.332 "strip_size_kb": 0,
00:18:16.332 "state": "online",
00:18:16.332 "raid_level": "raid1",
00:18:16.332 "superblock": true,
00:18:16.332 "num_base_bdevs": 2,
00:18:16.332 "num_base_bdevs_discovered": 2,
00:18:16.332 "num_base_bdevs_operational": 2,
00:18:16.332 "process": {
00:18:16.332 "type": "rebuild",
00:18:16.332 "target": "spare",
00:18:16.332 "progress": {
00:18:16.332 "blocks": 22528,
00:18:16.332 "percent": 35
00:18:16.332 }
00:18:16.332 },
00:18:16.332 "base_bdevs_list": [
00:18:16.332 {
00:18:16.332 "name": "spare",
00:18:16.332 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:16.332 "is_configured": true,
00:18:16.332 "data_offset": 2048,
00:18:16.332 "data_size": 63488
00:18:16.332 },
00:18:16.332 {
00:18:16.332 "name": "BaseBdev2",
00:18:16.332 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:16.332 "is_configured": true,
00:18:16.332 "data_offset": 2048,
00:18:16.332 "data_size": 63488
00:18:16.332 }
00:18:16.332 ]
00:18:16.332 }'
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:18:16.332 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:18:16.332 [2024-07-24 23:40:01.271149] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:18:16.596 [2024-07-24 23:40:01.359076] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:18:16.596 [2024-07-24 23:40:01.359110] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:16.596 [2024-07-24 23:40:01.359119] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:18:16.596 [2024-07-24 23:40:01.359123] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:16.596 "name": "raid_bdev1",
00:18:16.596 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:16.596 "strip_size_kb": 0,
00:18:16.596 "state": "online",
00:18:16.596 "raid_level": "raid1",
00:18:16.596 "superblock": true,
00:18:16.596 "num_base_bdevs": 2,
00:18:16.596 "num_base_bdevs_discovered": 1,
00:18:16.596 "num_base_bdevs_operational": 1,
00:18:16.596 "base_bdevs_list": [
00:18:16.596 {
00:18:16.596 "name": null,
00:18:16.596 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:16.596 "is_configured": false,
00:18:16.596 "data_offset": 2048,
00:18:16.596 "data_size": 63488
00:18:16.596 },
00:18:16.596 {
00:18:16.596 "name": "BaseBdev2",
00:18:16.596 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:16.596 "is_configured": true,
00:18:16.596 "data_offset": 2048,
00:18:16.596 "data_size": 63488
00:18:16.596 }
00:18:16.596 ]
00:18:16.596 }'
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:16.596 23:40:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:17.161 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:17.420 "name": "raid_bdev1",
00:18:17.420 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:17.420 "strip_size_kb": 0,
00:18:17.420 "state": "online",
00:18:17.420 "raid_level": "raid1",
00:18:17.420 "superblock": true,
00:18:17.420 "num_base_bdevs": 2,
00:18:17.420 "num_base_bdevs_discovered": 1,
00:18:17.420 "num_base_bdevs_operational": 1,
00:18:17.420 "base_bdevs_list": [
00:18:17.420 {
00:18:17.420 "name": null,
00:18:17.420 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:17.420 "is_configured": false,
00:18:17.420 "data_offset": 2048,
00:18:17.420 "data_size": 63488
00:18:17.420 },
00:18:17.420 {
00:18:17.420 "name": "BaseBdev2",
00:18:17.420 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:17.420 "is_configured": true,
00:18:17.420 "data_offset": 2048,
00:18:17.420 "data_size": 63488
00:18:17.420 }
00:18:17.420 ]
00:18:17.420 }'
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:18:17.420 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:18:17.678 [2024-07-24 23:40:02.453988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:18:17.678 [2024-07-24 23:40:02.458302] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2642910
00:18:17.678 [2024-07-24 23:40:02.459360] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:18:17.678 23:40:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:18.611 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:18.869 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:18.870 "name": "raid_bdev1",
00:18:18.870 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:18.870 "strip_size_kb": 0,
00:18:18.870 "state": "online",
00:18:18.870 "raid_level": "raid1",
00:18:18.870 "superblock": true,
00:18:18.870 "num_base_bdevs": 2,
00:18:18.870 "num_base_bdevs_discovered": 2,
00:18:18.870 "num_base_bdevs_operational": 2,
00:18:18.870 "process": {
00:18:18.870 "type": "rebuild",
00:18:18.870 "target": "spare",
00:18:18.870 "progress": {
00:18:18.870 "blocks": 22528,
00:18:18.870 "percent": 35
00:18:18.870 }
00:18:18.870 },
00:18:18.870 "base_bdevs_list": [
00:18:18.870 {
00:18:18.870 "name": "spare",
00:18:18.870 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:18.870 "is_configured": true,
00:18:18.870 "data_offset": 2048,
00:18:18.870 "data_size": 63488
00:18:18.870 },
00:18:18.870 {
00:18:18.870 "name": "BaseBdev2",
00:18:18.870 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:18.870 "is_configured": true,
00:18:18.870 "data_offset": 2048,
00:18:18.870 "data_size": 63488
00:18:18.870 }
00:18:18.870 ]
00:18:18.870 }'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:18:18.870 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=596
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:18.870 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:19.128 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:19.128 "name": "raid_bdev1",
00:18:19.128 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:19.128 "strip_size_kb": 0,
00:18:19.128 "state": "online",
00:18:19.128 "raid_level": "raid1",
00:18:19.128 "superblock": true,
00:18:19.128 "num_base_bdevs": 2,
00:18:19.128 "num_base_bdevs_discovered": 2,
00:18:19.128 "num_base_bdevs_operational": 2,
00:18:19.128 "process": {
00:18:19.129 "type": "rebuild",
00:18:19.129 "target": "spare",
00:18:19.129 "progress": {
00:18:19.129 "blocks": 28672,
00:18:19.129 "percent": 45
00:18:19.129 }
00:18:19.129 },
00:18:19.129 "base_bdevs_list": [
00:18:19.129 {
00:18:19.129 "name": "spare",
00:18:19.129 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:19.129 "is_configured": true,
00:18:19.129 "data_offset": 2048,
00:18:19.129 "data_size": 63488
00:18:19.129 },
00:18:19.129 {
00:18:19.129 "name": "BaseBdev2",
00:18:19.129 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:19.129 "is_configured": true,
00:18:19.129 "data_offset": 2048,
00:18:19.129 "data_size": 63488
00:18:19.129 }
00:18:19.129 ]
00:18:19.129 }'
00:18:19.129 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:19.129 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:18:19.129 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:19.129 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:18:19.129 23:40:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:20.064 23:40:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:20.324 "name": "raid_bdev1",
00:18:20.324 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:20.324 "strip_size_kb": 0,
00:18:20.324 "state": "online",
00:18:20.324 "raid_level": "raid1",
00:18:20.324 "superblock": true,
00:18:20.324 "num_base_bdevs": 2,
00:18:20.324 "num_base_bdevs_discovered": 2,
00:18:20.324 "num_base_bdevs_operational": 2,
00:18:20.324 "process": {
00:18:20.324 "type": "rebuild",
00:18:20.324 "target": "spare",
00:18:20.324 "progress": {
00:18:20.324 "blocks": 53248,
00:18:20.324 "percent": 83
00:18:20.324 }
00:18:20.324 },
00:18:20.324 "base_bdevs_list": [
00:18:20.324 {
00:18:20.324 "name": "spare",
00:18:20.324 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:20.324 "is_configured": true,
00:18:20.324 "data_offset": 2048,
00:18:20.324 "data_size": 63488
00:18:20.324 },
00:18:20.324 {
00:18:20.324 "name": "BaseBdev2",
00:18:20.324 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:20.324 "is_configured": true,
00:18:20.324 "data_offset": 2048,
00:18:20.324 "data_size": 63488
00:18:20.324 }
00:18:20.324 ]
00:18:20.324 }'
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:18:20.324 23:40:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1
00:18:20.583 [2024-07-24 23:40:05.580867] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:18:20.583 [2024-07-24 23:40:05.580909] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:18:20.583 [2024-07-24 23:40:05.580968] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:21.514 "name": "raid_bdev1",
00:18:21.514 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:21.514 "strip_size_kb": 0,
00:18:21.514 "state": "online",
00:18:21.514 "raid_level": "raid1",
00:18:21.514 "superblock": true,
00:18:21.514 "num_base_bdevs": 2,
00:18:21.514 "num_base_bdevs_discovered": 2,
00:18:21.514 "num_base_bdevs_operational": 2,
00:18:21.514 "base_bdevs_list": [
00:18:21.514 {
00:18:21.514 "name": "spare",
00:18:21.514 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:21.514 "is_configured": true,
00:18:21.514 "data_offset": 2048,
00:18:21.514 "data_size": 63488
00:18:21.514 },
00:18:21.514 {
00:18:21.514 "name": "BaseBdev2",
00:18:21.514 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:21.514 "is_configured": true,
00:18:21.514 "data_offset": 2048,
00:18:21.514 "data_size": 63488
00:18:21.514 }
00:18:21.514 ]
00:18:21.514 }'
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:21.514 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:18:21.772 "name": "raid_bdev1",
00:18:21.772 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:21.772 "strip_size_kb": 0,
00:18:21.772 "state": "online",
00:18:21.772 "raid_level": "raid1",
00:18:21.772 "superblock": true,
00:18:21.772 "num_base_bdevs": 2,
00:18:21.772 "num_base_bdevs_discovered": 2,
00:18:21.772 "num_base_bdevs_operational": 2,
00:18:21.772 "base_bdevs_list": [
00:18:21.772 {
00:18:21.772 "name": "spare",
00:18:21.772 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:21.772 "is_configured": true,
00:18:21.772 "data_offset": 2048,
00:18:21.772 "data_size": 63488
00:18:21.772 },
00:18:21.772 {
00:18:21.772 "name": "BaseBdev2",
00:18:21.772 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:21.772 "is_configured": true,
00:18:21.772 "data_offset": 2048,
00:18:21.772 "data_size": 63488
00:18:21.772 }
00:18:21.772 ]
00:18:21.772 }'
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:21.772 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:18:22.031 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:22.031 "name": "raid_bdev1",
00:18:22.031 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e",
00:18:22.031 "strip_size_kb": 0,
00:18:22.031 "state": "online",
00:18:22.031 "raid_level": "raid1",
00:18:22.031 "superblock": true,
00:18:22.031 "num_base_bdevs": 2,
00:18:22.031 "num_base_bdevs_discovered": 2,
00:18:22.031 "num_base_bdevs_operational": 2,
00:18:22.031 "base_bdevs_list": [
00:18:22.031 {
00:18:22.031 "name": "spare",
00:18:22.031 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d",
00:18:22.031 "is_configured": true,
00:18:22.031 "data_offset": 2048,
00:18:22.031 "data_size": 63488
00:18:22.031 },
00:18:22.031 {
00:18:22.031 "name": "BaseBdev2",
00:18:22.031 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94",
00:18:22.031 "is_configured": true,
00:18:22.031 "data_offset": 2048,
00:18:22.031 "data_size": 63488
00:18:22.031 }
00:18:22.031 ]
00:18:22.031 }'
00:18:22.031 23:40:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:22.031 23:40:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:22.597 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:18:22.597 [2024-07-24 23:40:07.566035] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:18:22.597 [2024-07-24 23:40:07.566055] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:18:22.597 [2024-07-24 23:40:07.566097] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:18:22.597 [2024-07-24 23:40:07.566136] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:18:22.597 [2024-07-24 23:40:07.566142] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26494f0 name raid_bdev1, state offline
00:18:22.597 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:22.597 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']'
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1'
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare')
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:18:22.855 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
00:18:23.112
/dev/nbd0 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:23.112 1+0 records in 00:18:23.112 1+0 records out 00:18:23.112 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227593 s, 18.0 MB/s 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.112 23:40:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:23.113 23:40:07 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@889 -- # return 0 00:18:23.113 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:23.113 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:23.113 23:40:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:23.370 /dev/nbd1 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:23.370 1+0 records in 00:18:23.370 1+0 records out 00:18:23.370 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232612 s, 17.6 MB/s 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:23.370 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:23.371 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:23.371 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:18:23.629 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:23.888 23:40:08 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:24.147 [2024-07-24 23:40:08.935307] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:24.147 [2024-07-24 23:40:08.935342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:24.147 [2024-07-24 23:40:08.935357] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b50c0 00:18:24.147 [2024-07-24 23:40:08.935364] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:24.147 [2024-07-24 23:40:08.936570] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:24.147 [2024-07-24 23:40:08.936590] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:24.147 [2024-07-24 23:40:08.936645] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:24.147 [2024-07-24 23:40:08.936663] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:24.147 [2024-07-24 23:40:08.936736] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:24.147 spare 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:24.147 23:40:08 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.147 23:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:24.147 [2024-07-24 23:40:09.037028] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26476e0 00:18:24.147 [2024-07-24 23:40:09.037040] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:24.147 [2024-07-24 23:40:09.037187] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x264b850 00:18:24.147 [2024-07-24 23:40:09.037304] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26476e0 00:18:24.147 [2024-07-24 23:40:09.037310] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26476e0 00:18:24.147 [2024-07-24 23:40:09.037386] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:24.147 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.147 "name": "raid_bdev1", 00:18:24.147 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:24.147 "strip_size_kb": 0, 00:18:24.147 "state": "online", 00:18:24.147 "raid_level": "raid1", 00:18:24.147 "superblock": true, 00:18:24.147 "num_base_bdevs": 2, 00:18:24.147 "num_base_bdevs_discovered": 2, 00:18:24.147 "num_base_bdevs_operational": 2, 00:18:24.147 "base_bdevs_list": [ 00:18:24.147 { 00:18:24.147 "name": 
"spare", 00:18:24.147 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d", 00:18:24.147 "is_configured": true, 00:18:24.147 "data_offset": 2048, 00:18:24.147 "data_size": 63488 00:18:24.147 }, 00:18:24.147 { 00:18:24.147 "name": "BaseBdev2", 00:18:24.147 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:24.147 "is_configured": true, 00:18:24.147 "data_offset": 2048, 00:18:24.147 "data_size": 63488 00:18:24.147 } 00:18:24.147 ] 00:18:24.147 }' 00:18:24.147 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.147 23:40:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:24.713 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:24.714 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:24.972 "name": "raid_bdev1", 00:18:24.972 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:24.972 "strip_size_kb": 0, 00:18:24.972 "state": "online", 00:18:24.972 "raid_level": "raid1", 00:18:24.972 "superblock": true, 00:18:24.972 "num_base_bdevs": 2, 00:18:24.972 "num_base_bdevs_discovered": 2, 00:18:24.972 "num_base_bdevs_operational": 2, 00:18:24.972 
"base_bdevs_list": [ 00:18:24.972 { 00:18:24.972 "name": "spare", 00:18:24.972 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d", 00:18:24.972 "is_configured": true, 00:18:24.972 "data_offset": 2048, 00:18:24.972 "data_size": 63488 00:18:24.972 }, 00:18:24.972 { 00:18:24.972 "name": "BaseBdev2", 00:18:24.972 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:24.972 "is_configured": true, 00:18:24.972 "data_offset": 2048, 00:18:24.972 "data_size": 63488 00:18:24.972 } 00:18:24.972 ] 00:18:24.972 }' 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:18:24.972 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.231 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:18:25.231 23:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:25.231 [2024-07-24 23:40:10.146515] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.231 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.490 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.490 "name": "raid_bdev1", 00:18:25.490 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:25.490 "strip_size_kb": 0, 00:18:25.490 "state": "online", 00:18:25.490 "raid_level": "raid1", 00:18:25.490 "superblock": true, 00:18:25.490 "num_base_bdevs": 2, 00:18:25.490 "num_base_bdevs_discovered": 1, 00:18:25.490 "num_base_bdevs_operational": 1, 00:18:25.490 "base_bdevs_list": [ 00:18:25.490 { 00:18:25.490 "name": null, 00:18:25.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.490 "is_configured": false, 00:18:25.490 "data_offset": 2048, 00:18:25.490 "data_size": 63488 00:18:25.490 }, 00:18:25.490 { 00:18:25.490 "name": "BaseBdev2", 00:18:25.490 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:25.490 "is_configured": true, 00:18:25.490 
"data_offset": 2048, 00:18:25.490 "data_size": 63488 00:18:25.490 } 00:18:25.490 ] 00:18:25.490 }' 00:18:25.490 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.490 23:40:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.063 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:26.063 [2024-07-24 23:40:10.956603] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:26.063 [2024-07-24 23:40:10.956711] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:26.063 [2024-07-24 23:40:10.956721] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:26.063 [2024-07-24 23:40:10.956738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:26.063 [2024-07-24 23:40:10.960914] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26479c0 00:18:26.063 [2024-07-24 23:40:10.962405] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:26.063 23:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:27.003 
23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.003 23:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:27.261 "name": "raid_bdev1", 00:18:27.261 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:27.261 "strip_size_kb": 0, 00:18:27.261 "state": "online", 00:18:27.261 "raid_level": "raid1", 00:18:27.261 "superblock": true, 00:18:27.261 "num_base_bdevs": 2, 00:18:27.261 "num_base_bdevs_discovered": 2, 00:18:27.261 "num_base_bdevs_operational": 2, 00:18:27.261 "process": { 00:18:27.261 "type": "rebuild", 00:18:27.261 "target": "spare", 00:18:27.261 "progress": { 00:18:27.261 "blocks": 22528, 00:18:27.261 "percent": 35 00:18:27.261 } 00:18:27.261 }, 00:18:27.261 "base_bdevs_list": [ 00:18:27.261 { 00:18:27.261 "name": "spare", 00:18:27.261 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d", 00:18:27.261 "is_configured": true, 00:18:27.261 "data_offset": 2048, 00:18:27.261 "data_size": 63488 00:18:27.261 }, 00:18:27.261 { 00:18:27.261 "name": "BaseBdev2", 00:18:27.261 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:27.261 "is_configured": true, 00:18:27.261 "data_offset": 2048, 00:18:27.261 "data_size": 63488 00:18:27.261 } 00:18:27.261 ] 00:18:27.261 }' 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:27.261 23:40:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:27.520 [2024-07-24 23:40:12.405056] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:27.520 [2024-07-24 23:40:12.472916] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:27.520 [2024-07-24 23:40:12.472951] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:27.520 [2024-07-24 23:40:12.472959] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:27.520 [2024-07-24 23:40:12.472979] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.520 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:27.778 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.778 "name": "raid_bdev1", 00:18:27.778 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:27.778 "strip_size_kb": 0, 00:18:27.778 "state": "online", 00:18:27.778 "raid_level": "raid1", 00:18:27.778 "superblock": true, 00:18:27.778 "num_base_bdevs": 2, 00:18:27.778 "num_base_bdevs_discovered": 1, 00:18:27.778 "num_base_bdevs_operational": 1, 00:18:27.778 "base_bdevs_list": [ 00:18:27.778 { 00:18:27.778 "name": null, 00:18:27.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.778 "is_configured": false, 00:18:27.778 "data_offset": 2048, 00:18:27.778 "data_size": 63488 00:18:27.778 }, 00:18:27.778 { 00:18:27.778 "name": "BaseBdev2", 00:18:27.778 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:27.778 "is_configured": true, 00:18:27.778 "data_offset": 2048, 00:18:27.778 "data_size": 63488 00:18:27.778 } 00:18:27.778 ] 00:18:27.778 }' 00:18:27.778 23:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.778 23:40:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.343 23:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:28.343 [2024-07-24 23:40:13.287030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:28.343 [2024-07-24 23:40:13.287064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.343 [2024-07-24 23:40:13.287076] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x248fd30 
00:18:28.343 [2024-07-24 23:40:13.287082] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.343 [2024-07-24 23:40:13.287340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.343 [2024-07-24 23:40:13.287350] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:28.343 [2024-07-24 23:40:13.287401] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:28.343 [2024-07-24 23:40:13.287408] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:28.343 [2024-07-24 23:40:13.287413] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:28.343 [2024-07-24 23:40:13.287422] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:28.343 [2024-07-24 23:40:13.291623] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x264a5b0 00:18:28.343 spare 00:18:28.343 [2024-07-24 23:40:13.292636] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:28.343 23:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:29.720 "name": "raid_bdev1", 00:18:29.720 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:29.720 "strip_size_kb": 0, 00:18:29.720 "state": "online", 00:18:29.720 "raid_level": "raid1", 00:18:29.720 "superblock": true, 00:18:29.720 "num_base_bdevs": 2, 00:18:29.720 "num_base_bdevs_discovered": 2, 00:18:29.720 "num_base_bdevs_operational": 2, 00:18:29.720 "process": { 00:18:29.720 "type": "rebuild", 00:18:29.720 "target": "spare", 00:18:29.720 "progress": { 00:18:29.720 "blocks": 22528, 00:18:29.720 "percent": 35 00:18:29.720 } 00:18:29.720 }, 00:18:29.720 "base_bdevs_list": [ 00:18:29.720 { 00:18:29.720 "name": "spare", 00:18:29.720 "uuid": "46a9e48e-3919-5739-b5f3-7ec713d5bb9d", 00:18:29.720 "is_configured": true, 00:18:29.720 "data_offset": 2048, 00:18:29.720 "data_size": 63488 00:18:29.720 }, 00:18:29.720 { 00:18:29.720 "name": "BaseBdev2", 00:18:29.720 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:29.720 "is_configured": true, 00:18:29.720 "data_offset": 2048, 00:18:29.720 "data_size": 63488 00:18:29.720 } 00:18:29.720 ] 00:18:29.720 }' 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:29.720 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:29.720 [2024-07-24 23:40:14.711196] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:29.979 [2024-07-24 23:40:14.803071] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:29.979 [2024-07-24 23:40:14.803107] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.979 [2024-07-24 23:40:14.803116] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:29.979 [2024-07-24 23:40:14.803120] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.979 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:30.238 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.238 "name": "raid_bdev1", 00:18:30.238 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:30.238 "strip_size_kb": 0, 00:18:30.238 "state": "online", 00:18:30.238 "raid_level": "raid1", 00:18:30.238 "superblock": true, 00:18:30.238 "num_base_bdevs": 2, 00:18:30.238 "num_base_bdevs_discovered": 1, 00:18:30.238 "num_base_bdevs_operational": 1, 00:18:30.238 "base_bdevs_list": [ 00:18:30.238 { 00:18:30.238 "name": null, 00:18:30.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.238 "is_configured": false, 00:18:30.238 "data_offset": 2048, 00:18:30.238 "data_size": 63488 00:18:30.238 }, 00:18:30.238 { 00:18:30.238 "name": "BaseBdev2", 00:18:30.238 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:30.238 "is_configured": true, 00:18:30.238 "data_offset": 2048, 00:18:30.238 "data_size": 63488 00:18:30.238 } 00:18:30.238 ] 00:18:30.238 }' 00:18:30.238 23:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.238 23:40:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:18:30.497 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:30.756 "name": "raid_bdev1", 00:18:30.756 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:30.756 "strip_size_kb": 0, 00:18:30.756 "state": "online", 00:18:30.756 "raid_level": "raid1", 00:18:30.756 "superblock": true, 00:18:30.756 "num_base_bdevs": 2, 00:18:30.756 "num_base_bdevs_discovered": 1, 00:18:30.756 "num_base_bdevs_operational": 1, 00:18:30.756 "base_bdevs_list": [ 00:18:30.756 { 00:18:30.756 "name": null, 00:18:30.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.756 "is_configured": false, 00:18:30.756 "data_offset": 2048, 00:18:30.756 "data_size": 63488 00:18:30.756 }, 00:18:30.756 { 00:18:30.756 "name": "BaseBdev2", 00:18:30.756 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:30.756 "is_configured": true, 00:18:30.756 "data_offset": 2048, 00:18:30.756 "data_size": 63488 00:18:30.756 } 00:18:30.756 ] 00:18:30.756 }' 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:30.756 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:18:31.015 23:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:31.274 [2024-07-24 23:40:16.058327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:31.274 [2024-07-24 23:40:16.058357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:31.274 [2024-07-24 23:40:16.058369] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2647d40 00:18:31.274 [2024-07-24 23:40:16.058391] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:31.274 [2024-07-24 23:40:16.058642] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:31.274 [2024-07-24 23:40:16.058653] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:31.274 [2024-07-24 23:40:16.058696] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:18:31.274 [2024-07-24 23:40:16.058704] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:31.274 [2024-07-24 23:40:16.058709] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:31.274 BaseBdev1 00:18:31.274 23:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:32.210 23:40:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.210 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.468 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.468 "name": "raid_bdev1", 00:18:32.468 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:32.468 "strip_size_kb": 0, 00:18:32.468 "state": "online", 00:18:32.468 "raid_level": "raid1", 00:18:32.468 "superblock": true, 00:18:32.468 "num_base_bdevs": 2, 00:18:32.468 "num_base_bdevs_discovered": 1, 00:18:32.468 "num_base_bdevs_operational": 1, 00:18:32.468 "base_bdevs_list": [ 00:18:32.468 { 00:18:32.468 "name": null, 00:18:32.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.468 "is_configured": false, 00:18:32.468 "data_offset": 2048, 00:18:32.468 "data_size": 63488 00:18:32.468 }, 00:18:32.468 { 00:18:32.468 "name": "BaseBdev2", 00:18:32.468 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:32.468 "is_configured": true, 00:18:32.468 "data_offset": 2048, 00:18:32.468 "data_size": 63488 00:18:32.468 } 00:18:32.469 ] 00:18:32.469 }' 00:18:32.469 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.469 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set 
+x 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.727 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:32.986 "name": "raid_bdev1", 00:18:32.986 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:32.986 "strip_size_kb": 0, 00:18:32.986 "state": "online", 00:18:32.986 "raid_level": "raid1", 00:18:32.986 "superblock": true, 00:18:32.986 "num_base_bdevs": 2, 00:18:32.986 "num_base_bdevs_discovered": 1, 00:18:32.986 "num_base_bdevs_operational": 1, 00:18:32.986 "base_bdevs_list": [ 00:18:32.986 { 00:18:32.986 "name": null, 00:18:32.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.986 "is_configured": false, 00:18:32.986 "data_offset": 2048, 00:18:32.986 "data_size": 63488 00:18:32.986 }, 00:18:32.986 { 00:18:32.986 "name": "BaseBdev2", 00:18:32.986 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:32.986 "is_configured": true, 00:18:32.986 "data_offset": 2048, 00:18:32.986 "data_size": 63488 00:18:32.986 } 00:18:32.986 ] 00:18:32.986 }' 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:32.986 23:40:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:32.986 23:40:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:33.245 [2024-07-24 23:40:18.083576] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:33.245 [2024-07-24 23:40:18.083665] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:33.245 [2024-07-24 23:40:18.083673] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:33.245 request: 00:18:33.245 { 00:18:33.245 "base_bdev": "BaseBdev1", 00:18:33.245 "raid_bdev": "raid_bdev1", 00:18:33.245 "method": "bdev_raid_add_base_bdev", 00:18:33.245 "req_id": 1 00:18:33.245 } 00:18:33.245 Got JSON-RPC error response 00:18:33.245 response: 00:18:33.245 { 00:18:33.245 "code": -22, 00:18:33.245 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:18:33.245 } 00:18:33.245 23:40:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:18:33.245 23:40:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:18:33.245 23:40:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:18:33.245 23:40:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:18:33.245 23:40:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:18:34.181 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:34.181 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:34.181 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:18:34.181 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:34.181 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.182 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.440 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.440 "name": "raid_bdev1", 00:18:34.440 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:34.440 "strip_size_kb": 0, 00:18:34.440 "state": "online", 00:18:34.440 "raid_level": "raid1", 00:18:34.440 "superblock": true, 00:18:34.440 "num_base_bdevs": 2, 00:18:34.440 "num_base_bdevs_discovered": 1, 00:18:34.440 "num_base_bdevs_operational": 1, 00:18:34.440 "base_bdevs_list": [ 00:18:34.440 { 00:18:34.440 "name": null, 00:18:34.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:34.440 "is_configured": false, 00:18:34.440 "data_offset": 2048, 00:18:34.440 "data_size": 63488 00:18:34.440 }, 00:18:34.440 { 00:18:34.440 "name": "BaseBdev2", 00:18:34.440 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:34.440 "is_configured": true, 00:18:34.440 "data_offset": 2048, 00:18:34.440 
"data_size": 63488 00:18:34.440 } 00:18:34.440 ] 00:18:34.440 }' 00:18:34.440 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.440 23:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:35.008 "name": "raid_bdev1", 00:18:35.008 "uuid": "bdfd073b-af41-43ce-9703-5909d688425e", 00:18:35.008 "strip_size_kb": 0, 00:18:35.008 "state": "online", 00:18:35.008 "raid_level": "raid1", 00:18:35.008 "superblock": true, 00:18:35.008 "num_base_bdevs": 2, 00:18:35.008 "num_base_bdevs_discovered": 1, 00:18:35.008 "num_base_bdevs_operational": 1, 00:18:35.008 "base_bdevs_list": [ 00:18:35.008 { 00:18:35.008 "name": null, 00:18:35.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.008 "is_configured": false, 00:18:35.008 "data_offset": 2048, 00:18:35.008 "data_size": 63488 00:18:35.008 }, 00:18:35.008 { 00:18:35.008 "name": "BaseBdev2", 00:18:35.008 "uuid": "67f0cf39-ff43-5d15-b35f-d911fe0d9f94", 00:18:35.008 "is_configured": true, 
00:18:35.008 "data_offset": 2048, 00:18:35.008 "data_size": 63488 00:18:35.008 } 00:18:35.008 ] 00:18:35.008 }' 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 349020 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 349020 ']' 00:18:35.008 23:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 349020 00:18:35.008 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:18:35.008 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:35.008 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 349020 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 349020' 00:18:35.267 killing process with pid 349020 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 349020 00:18:35.267 Received shutdown signal, test time was about 60.000000 seconds 00:18:35.267 00:18:35.267 Latency(us) 00:18:35.267 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:35.267 
=================================================================================================================== 00:18:35.267 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:35.267 [2024-07-24 23:40:20.043291] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:35.267 [2024-07-24 23:40:20.043358] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:35.267 [2024-07-24 23:40:20.043391] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:35.267 [2024-07-24 23:40:20.043397] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26476e0 name raid_bdev1, state offline 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 349020 00:18:35.267 [2024-07-24 23:40:20.066479] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:18:35.267 00:18:35.267 real 0m28.095s 00:18:35.267 user 0m40.972s 00:18:35.267 sys 0m4.039s 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:35.267 23:40:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.267 ************************************ 00:18:35.267 END TEST raid_rebuild_test_sb 00:18:35.267 ************************************ 00:18:35.527 23:40:20 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:18:35.527 23:40:20 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:18:35.527 23:40:20 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:35.527 23:40:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:35.527 ************************************ 00:18:35.527 START TEST raid_rebuild_test_io 00:18:35.527 ************************************ 00:18:35.527 
23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false true true 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:35.527 
23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=354036 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 354036 /var/tmp/spdk-raid.sock 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 354036 ']' 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:35.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:35.527 23:40:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:35.527 [2024-07-24 23:40:20.365700] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:18:35.527 [2024-07-24 23:40:20.365740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354036 ] 00:18:35.527 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:35.527 Zero copy mechanism will not be used. 00:18:35.527 [2024-07-24 23:40:20.430398] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:35.527 [2024-07-24 23:40:20.500907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:35.786 [2024-07-24 23:40:20.550967] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:35.786 [2024-07-24 23:40:20.550993] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:36.435 23:40:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:36.435 23:40:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:18:36.435 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:36.435 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:36.435 BaseBdev1_malloc 00:18:36.435 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:36.708 [2024-07-24 23:40:21.477976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:36.708 [2024-07-24 23:40:21.478011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.708 [2024-07-24 23:40:21.478024] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1bc01c0 00:18:36.708 [2024-07-24 23:40:21.478045] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.708 [2024-07-24 23:40:21.479105] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.708 [2024-07-24 23:40:21.479124] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:36.708 BaseBdev1 00:18:36.708 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:36.708 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:36.708 BaseBdev2_malloc 00:18:36.709 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:36.967 [2024-07-24 23:40:21.822391] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:36.967 [2024-07-24 23:40:21.822419] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:36.967 [2024-07-24 23:40:21.822433] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc0ce0 00:18:36.967 [2024-07-24 23:40:21.822439] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:36.967 [2024-07-24 23:40:21.823360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:36.967 [2024-07-24 23:40:21.823378] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:36.967 BaseBdev2 00:18:36.967 23:40:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:18:37.226 spare_malloc 00:18:37.226 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:37.226 spare_delay 00:18:37.226 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:37.484 [2024-07-24 23:40:22.346949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:37.484 [2024-07-24 23:40:22.346981] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.484 [2024-07-24 23:40:22.346993] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6f340 00:18:37.484 [2024-07-24 23:40:22.346998] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.484 [2024-07-24 23:40:22.347990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.484 [2024-07-24 23:40:22.348009] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:37.484 spare 00:18:37.484 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:37.744 [2024-07-24 23:40:22.515404] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:37.744 [2024-07-24 23:40:22.516175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.744 [2024-07-24 23:40:22.516224] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d704f0 00:18:37.744 [2024-07-24 23:40:22.516230] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:18:37.744 [2024-07-24 23:40:22.516354] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d69910 00:18:37.744 [2024-07-24 23:40:22.516444] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d704f0 00:18:37.744 [2024-07-24 23:40:22.516449] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d704f0 00:18:37.744 [2024-07-24 23:40:22.516521] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.744 "name": "raid_bdev1", 00:18:37.744 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:37.744 "strip_size_kb": 0, 00:18:37.744 "state": "online", 00:18:37.744 "raid_level": "raid1", 00:18:37.744 "superblock": false, 00:18:37.744 "num_base_bdevs": 2, 00:18:37.744 "num_base_bdevs_discovered": 2, 00:18:37.744 "num_base_bdevs_operational": 2, 00:18:37.744 "base_bdevs_list": [ 00:18:37.744 { 00:18:37.744 "name": "BaseBdev1", 00:18:37.744 "uuid": "a1118be4-50bd-5785-a881-eea42297996f", 00:18:37.744 "is_configured": true, 00:18:37.744 "data_offset": 0, 00:18:37.744 "data_size": 65536 00:18:37.744 }, 00:18:37.744 { 00:18:37.744 "name": "BaseBdev2", 00:18:37.744 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:37.744 "is_configured": true, 00:18:37.744 "data_offset": 0, 00:18:37.744 "data_size": 65536 00:18:37.744 } 00:18:37.744 ] 00:18:37.744 }' 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.744 23:40:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:38.311 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:38.311 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:38.311 [2024-07-24 23:40:23.309599] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:38.572 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:18:38.572 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.572 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:38.572 23:40:23 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:18:38.572 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:18:38.572 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:38.573 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:38.831 [2024-07-24 23:40:23.592008] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6b050 00:18:38.831 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:38.831 Zero copy mechanism will not be used. 00:18:38.831 Running I/O for 60 seconds... 00:18:38.831 [2024-07-24 23:40:23.656727] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:38.831 [2024-07-24 23:40:23.662091] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d6b050 00:18:38.831 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.832 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.091 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.091 "name": "raid_bdev1", 00:18:39.091 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:39.091 "strip_size_kb": 0, 00:18:39.091 "state": "online", 00:18:39.091 "raid_level": "raid1", 00:18:39.091 "superblock": false, 00:18:39.091 "num_base_bdevs": 2, 00:18:39.091 "num_base_bdevs_discovered": 1, 00:18:39.091 "num_base_bdevs_operational": 1, 00:18:39.091 "base_bdevs_list": [ 00:18:39.091 { 00:18:39.091 "name": null, 00:18:39.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.091 "is_configured": false, 00:18:39.091 "data_offset": 0, 00:18:39.091 "data_size": 65536 00:18:39.091 }, 00:18:39.091 { 00:18:39.091 "name": "BaseBdev2", 00:18:39.091 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:39.091 "is_configured": true, 00:18:39.091 "data_offset": 0, 00:18:39.091 "data_size": 65536 00:18:39.091 } 00:18:39.091 ] 00:18:39.091 }' 00:18:39.091 23:40:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.091 23:40:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:39.350 23:40:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:39.608 [2024-07-24 23:40:24.483139] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:39.608 [2024-07-24 23:40:24.511350] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf2d30 00:18:39.608 [2024-07-24 23:40:24.512824] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:39.608 23:40:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:39.867 [2024-07-24 23:40:24.625413] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:39.867 [2024-07-24 23:40:24.625767] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:39.867 [2024-07-24 23:40:24.843484] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:39.867 [2024-07-24 23:40:24.843670] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:40.435 [2024-07-24 23:40:25.206364] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:40.435 [2024-07-24 23:40:25.424780] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:40.693 23:40:25 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.693 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.952 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:40.952 "name": "raid_bdev1", 00:18:40.952 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:40.952 "strip_size_kb": 0, 00:18:40.952 "state": "online", 00:18:40.952 "raid_level": "raid1", 00:18:40.952 "superblock": false, 00:18:40.952 "num_base_bdevs": 2, 00:18:40.952 "num_base_bdevs_discovered": 2, 00:18:40.952 "num_base_bdevs_operational": 2, 00:18:40.952 "process": { 00:18:40.952 "type": "rebuild", 00:18:40.952 "target": "spare", 00:18:40.952 "progress": { 00:18:40.952 "blocks": 12288, 00:18:40.952 "percent": 18 00:18:40.952 } 00:18:40.952 }, 00:18:40.952 "base_bdevs_list": [ 00:18:40.952 { 00:18:40.952 "name": "spare", 00:18:40.952 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:40.952 "is_configured": true, 00:18:40.952 "data_offset": 0, 00:18:40.952 "data_size": 65536 00:18:40.952 }, 00:18:40.952 { 00:18:40.952 "name": "BaseBdev2", 00:18:40.952 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:40.952 "is_configured": true, 00:18:40.952 "data_offset": 0, 00:18:40.952 "data_size": 65536 00:18:40.952 } 00:18:40.952 ] 00:18:40.952 }' 00:18:40.952 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:40.952 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:40.952 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:40.953 23:40:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:40.953 23:40:25 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:40.953 [2024-07-24 23:40:25.852202] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:40.953 [2024-07-24 23:40:25.923528] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:41.212 [2024-07-24 23:40:25.970155] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:41.212 [2024-07-24 23:40:26.076855] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:41.212 [2024-07-24 23:40:26.089081] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:41.212 [2024-07-24 23:40:26.089105] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:41.212 [2024-07-24 23:40:26.089112] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:41.212 [2024-07-24 23:40:26.104250] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d6b050 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.212 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.472 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.472 "name": "raid_bdev1", 00:18:41.472 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:41.472 "strip_size_kb": 0, 00:18:41.472 "state": "online", 00:18:41.472 "raid_level": "raid1", 00:18:41.472 "superblock": false, 00:18:41.472 "num_base_bdevs": 2, 00:18:41.472 "num_base_bdevs_discovered": 1, 00:18:41.472 "num_base_bdevs_operational": 1, 00:18:41.472 "base_bdevs_list": [ 00:18:41.472 { 00:18:41.472 "name": null, 00:18:41.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.472 "is_configured": false, 00:18:41.472 "data_offset": 0, 00:18:41.472 "data_size": 65536 00:18:41.472 }, 00:18:41.472 { 00:18:41.472 "name": "BaseBdev2", 00:18:41.472 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:41.472 "is_configured": true, 00:18:41.472 "data_offset": 0, 00:18:41.472 "data_size": 65536 00:18:41.472 } 00:18:41.472 ] 00:18:41.472 }' 00:18:41.472 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.472 23:40:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:42.040 23:40:26 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:42.040 "name": "raid_bdev1", 00:18:42.040 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:42.040 "strip_size_kb": 0, 00:18:42.040 "state": "online", 00:18:42.040 "raid_level": "raid1", 00:18:42.040 "superblock": false, 00:18:42.040 "num_base_bdevs": 2, 00:18:42.040 "num_base_bdevs_discovered": 1, 00:18:42.040 "num_base_bdevs_operational": 1, 00:18:42.040 "base_bdevs_list": [ 00:18:42.040 { 00:18:42.040 "name": null, 00:18:42.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.040 "is_configured": false, 00:18:42.040 "data_offset": 0, 00:18:42.040 "data_size": 65536 00:18:42.040 }, 00:18:42.040 { 00:18:42.040 "name": "BaseBdev2", 00:18:42.040 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:42.040 "is_configured": true, 00:18:42.040 "data_offset": 0, 00:18:42.040 "data_size": 65536 00:18:42.040 } 00:18:42.040 ] 00:18:42.040 }' 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:42.040 23:40:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # 
jq -r '.process.target // "none"' 00:18:42.040 23:40:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:42.040 23:40:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:42.300 [2024-07-24 23:40:27.183043] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:42.300 23:40:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:42.300 [2024-07-24 23:40:27.222870] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf3720 00:18:42.300 [2024-07-24 23:40:27.223909] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:42.559 [2024-07-24 23:40:27.349186] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:42.818 [2024-07-24 23:40:27.572692] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:42.818 [2024-07-24 23:40:27.572848] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:43.076 [2024-07-24 23:40:27.919521] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:43.335 [2024-07-24 23:40:28.142892] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:43.335 [2024-07-24 23:40:28.143017] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 
-- # local raid_bdev_name=raid_bdev1 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.335 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.595 [2024-07-24 23:40:28.379788] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:43.595 "name": "raid_bdev1", 00:18:43.595 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:43.595 "strip_size_kb": 0, 00:18:43.595 "state": "online", 00:18:43.595 "raid_level": "raid1", 00:18:43.595 "superblock": false, 00:18:43.595 "num_base_bdevs": 2, 00:18:43.595 "num_base_bdevs_discovered": 2, 00:18:43.595 "num_base_bdevs_operational": 2, 00:18:43.595 "process": { 00:18:43.595 "type": "rebuild", 00:18:43.595 "target": "spare", 00:18:43.595 "progress": { 00:18:43.595 "blocks": 12288, 00:18:43.595 "percent": 18 00:18:43.595 } 00:18:43.595 }, 00:18:43.595 "base_bdevs_list": [ 00:18:43.595 { 00:18:43.595 "name": "spare", 00:18:43.595 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:43.595 "is_configured": true, 00:18:43.595 "data_offset": 0, 00:18:43.595 "data_size": 65536 00:18:43.595 }, 00:18:43.595 { 00:18:43.595 "name": "BaseBdev2", 00:18:43.595 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:43.595 "is_configured": true, 00:18:43.595 "data_offset": 0, 00:18:43.595 "data_size": 65536 00:18:43.595 } 00:18:43.595 ] 
00:18:43.595 }' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=621 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.595 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:18:43.596 [2024-07-24 23:40:28.499016] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:43.596 [2024-07-24 23:40:28.499162] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:43.855 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:43.855 "name": "raid_bdev1", 00:18:43.855 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:43.855 "strip_size_kb": 0, 00:18:43.855 "state": "online", 00:18:43.855 "raid_level": "raid1", 00:18:43.855 "superblock": false, 00:18:43.855 "num_base_bdevs": 2, 00:18:43.855 "num_base_bdevs_discovered": 2, 00:18:43.855 "num_base_bdevs_operational": 2, 00:18:43.855 "process": { 00:18:43.855 "type": "rebuild", 00:18:43.855 "target": "spare", 00:18:43.855 "progress": { 00:18:43.855 "blocks": 18432, 00:18:43.855 "percent": 28 00:18:43.855 } 00:18:43.855 }, 00:18:43.855 "base_bdevs_list": [ 00:18:43.855 { 00:18:43.855 "name": "spare", 00:18:43.855 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:43.855 "is_configured": true, 00:18:43.855 "data_offset": 0, 00:18:43.855 "data_size": 65536 00:18:43.855 }, 00:18:43.855 { 00:18:43.855 "name": "BaseBdev2", 00:18:43.855 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:43.855 "is_configured": true, 00:18:43.855 "data_offset": 0, 00:18:43.855 "data_size": 65536 00:18:43.855 } 00:18:43.855 ] 00:18:43.855 }' 00:18:43.855 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:43.855 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:43.855 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:43.855 23:40:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:43.855 23:40:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:44.114 [2024-07-24 23:40:29.072610] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:18:44.372 [2024-07-24 23:40:29.279676] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:44.940 "name": "raid_bdev1", 00:18:44.940 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:44.940 "strip_size_kb": 0, 00:18:44.940 "state": "online", 00:18:44.940 "raid_level": "raid1", 00:18:44.940 "superblock": false, 00:18:44.940 "num_base_bdevs": 2, 00:18:44.940 "num_base_bdevs_discovered": 2, 00:18:44.940 "num_base_bdevs_operational": 2, 00:18:44.940 "process": { 00:18:44.940 "type": "rebuild", 00:18:44.940 "target": "spare", 00:18:44.940 "progress": { 00:18:44.940 "blocks": 38912, 
00:18:44.940 "percent": 59 00:18:44.940 } 00:18:44.940 }, 00:18:44.940 "base_bdevs_list": [ 00:18:44.940 { 00:18:44.940 "name": "spare", 00:18:44.940 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:44.940 "is_configured": true, 00:18:44.940 "data_offset": 0, 00:18:44.940 "data_size": 65536 00:18:44.940 }, 00:18:44.940 { 00:18:44.940 "name": "BaseBdev2", 00:18:44.940 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:44.940 "is_configured": true, 00:18:44.940 "data_offset": 0, 00:18:44.940 "data_size": 65536 00:18:44.940 } 00:18:44.940 ] 00:18:44.940 }' 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:44.940 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:45.199 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:45.199 23:40:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:45.199 [2024-07-24 23:40:30.175990] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:18:45.458 [2024-07-24 23:40:30.394077] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:18:45.717 [2024-07-24 23:40:30.613303] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:18:45.991 [2024-07-24 23:40:30.826053] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.991 23:40:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.250 23:40:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:46.250 "name": "raid_bdev1", 00:18:46.250 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:46.250 "strip_size_kb": 0, 00:18:46.250 "state": "online", 00:18:46.250 "raid_level": "raid1", 00:18:46.250 "superblock": false, 00:18:46.250 "num_base_bdevs": 2, 00:18:46.250 "num_base_bdevs_discovered": 2, 00:18:46.250 "num_base_bdevs_operational": 2, 00:18:46.250 "process": { 00:18:46.250 "type": "rebuild", 00:18:46.250 "target": "spare", 00:18:46.250 "progress": { 00:18:46.250 "blocks": 55296, 00:18:46.250 "percent": 84 00:18:46.250 } 00:18:46.250 }, 00:18:46.250 "base_bdevs_list": [ 00:18:46.250 { 00:18:46.250 "name": "spare", 00:18:46.250 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:46.250 "is_configured": true, 00:18:46.250 "data_offset": 0, 00:18:46.250 "data_size": 65536 00:18:46.250 }, 00:18:46.250 { 00:18:46.250 "name": "BaseBdev2", 00:18:46.250 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:46.250 "is_configured": true, 00:18:46.250 "data_offset": 0, 00:18:46.250 "data_size": 65536 00:18:46.250 } 00:18:46.250 ] 00:18:46.250 }' 00:18:46.250 23:40:31 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:46.251 [2024-07-24 23:40:31.147629] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:18:46.251 23:40:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:46.251 23:40:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:46.251 23:40:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:46.251 23:40:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:46.509 [2024-07-24 23:40:31.260679] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:18:46.768 [2024-07-24 23:40:31.586779] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:46.768 [2024-07-24 23:40:31.692007] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:46.768 [2024-07-24 23:40:31.693404] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.335 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:47.594 "name": "raid_bdev1", 00:18:47.594 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:47.594 "strip_size_kb": 0, 00:18:47.594 "state": "online", 00:18:47.594 "raid_level": "raid1", 00:18:47.594 "superblock": false, 00:18:47.594 "num_base_bdevs": 2, 00:18:47.594 "num_base_bdevs_discovered": 2, 00:18:47.594 "num_base_bdevs_operational": 2, 00:18:47.594 "base_bdevs_list": [ 00:18:47.594 { 00:18:47.594 "name": "spare", 00:18:47.594 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:47.594 "is_configured": true, 00:18:47.594 "data_offset": 0, 00:18:47.594 "data_size": 65536 00:18:47.594 }, 00:18:47.594 { 00:18:47.594 "name": "BaseBdev2", 00:18:47.594 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:47.594 "is_configured": true, 00:18:47.594 "data_offset": 0, 00:18:47.594 "data_size": 65536 00:18:47.594 } 00:18:47.594 ] 00:18:47.594 }' 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.594 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:47.853 "name": "raid_bdev1", 00:18:47.853 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:47.853 "strip_size_kb": 0, 00:18:47.853 "state": "online", 00:18:47.853 "raid_level": "raid1", 00:18:47.853 "superblock": false, 00:18:47.853 "num_base_bdevs": 2, 00:18:47.853 "num_base_bdevs_discovered": 2, 00:18:47.853 "num_base_bdevs_operational": 2, 00:18:47.853 "base_bdevs_list": [ 00:18:47.853 { 00:18:47.853 "name": "spare", 00:18:47.853 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:47.853 "is_configured": true, 00:18:47.853 "data_offset": 0, 00:18:47.853 "data_size": 65536 00:18:47.853 }, 00:18:47.853 { 00:18:47.853 "name": "BaseBdev2", 00:18:47.853 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:47.853 "is_configured": true, 00:18:47.853 "data_offset": 0, 00:18:47.853 "data_size": 65536 00:18:47.853 } 00:18:47.853 ] 00:18:47.853 }' 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.853 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:48.111 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.111 "name": "raid_bdev1", 00:18:48.111 "uuid": "595b9bb1-134d-4380-a3be-3b6e60c4525a", 00:18:48.111 "strip_size_kb": 0, 00:18:48.111 "state": "online", 00:18:48.111 "raid_level": "raid1", 00:18:48.111 "superblock": false, 00:18:48.111 "num_base_bdevs": 2, 00:18:48.111 "num_base_bdevs_discovered": 2, 00:18:48.111 "num_base_bdevs_operational": 2, 00:18:48.111 "base_bdevs_list": [ 00:18:48.111 { 00:18:48.111 
"name": "spare", 00:18:48.111 "uuid": "c6e8cd7a-97e1-58c0-a548-a8a55a63653e", 00:18:48.111 "is_configured": true, 00:18:48.111 "data_offset": 0, 00:18:48.111 "data_size": 65536 00:18:48.111 }, 00:18:48.111 { 00:18:48.112 "name": "BaseBdev2", 00:18:48.112 "uuid": "0cbe2cb7-f71a-5086-a9c0-2d9c203e1de6", 00:18:48.112 "is_configured": true, 00:18:48.112 "data_offset": 0, 00:18:48.112 "data_size": 65536 00:18:48.112 } 00:18:48.112 ] 00:18:48.112 }' 00:18:48.112 23:40:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.112 23:40:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:48.370 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:48.629 [2024-07-24 23:40:33.458598] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:48.629 [2024-07-24 23:40:33.458622] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:48.629 00:18:48.629 Latency(us) 00:18:48.629 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:48.629 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:18:48.629 raid_bdev1 : 9.90 126.20 378.60 0.00 0.00 10918.88 242.83 111348.78 00:18:48.629 =================================================================================================================== 00:18:48.629 Total : 126.20 378.60 0.00 0.00 10918.88 242.83 111348.78 00:18:48.629 [2024-07-24 23:40:33.517715] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:48.629 [2024-07-24 23:40:33.517736] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:48.629 [2024-07-24 23:40:33.517795] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:48.629 
[2024-07-24 23:40:33.517802] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d704f0 name raid_bdev1, state offline 00:18:48.629 0 00:18:48.629 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.629 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:18:48.888 /dev/nbd0 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd0 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:48.888 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:49.147 1+0 records in 00:18:49.147 1+0 records out 00:18:49.147 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202333 s, 20.2 MB/s 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:49.147 23:40:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:18:49.147 /dev/nbd1 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:18:49.147 23:40:34 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:49.147 1+0 records in 00:18:49.147 1+0 records out 00:18:49.147 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223415 s, 18.3 MB/s 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:49.147 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- 
# local nbd_list 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:49.405 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 354036 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 354036 ']' 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 354036 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 354036 00:18:49.663 23:40:34 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 354036' 00:18:49.663 killing process with pid 354036 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 354036 00:18:49.663 Received shutdown signal, test time was about 10.956207 seconds 00:18:49.663 00:18:49.663 Latency(us) 00:18:49.663 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:49.663 =================================================================================================================== 00:18:49.663 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:18:49.663 [2024-07-24 23:40:34.576601] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:49.663 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 354036 00:18:49.663 [2024-07-24 23:40:34.595350] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:18:49.922 00:18:49.922 real 0m14.473s 00:18:49.922 user 0m21.468s 00:18:49.922 sys 0m1.741s 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:49.922 ************************************ 00:18:49.922 END TEST raid_rebuild_test_io 00:18:49.922 ************************************ 00:18:49.922 23:40:34 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:18:49.922 23:40:34 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:18:49.922 23:40:34 bdev_raid -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:18:49.922 23:40:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:49.922 ************************************ 00:18:49.922 START TEST raid_rebuild_test_sb_io 00:18:49.922 ************************************ 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true true true 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local 
base_bdevs 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=356748 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 356748 /var/tmp/spdk-raid.sock 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 356748 ']' 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:18:49.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:49.922 23:40:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:18:49.922 [2024-07-24 23:40:34.907832] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:18:49.922 [2024-07-24 23:40:34.907876] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid356748 ] 00:18:49.922 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:49.922 Zero copy mechanism will not be used. 00:18:50.181 [2024-07-24 23:40:34.970852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:50.181 [2024-07-24 23:40:35.041825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:50.181 [2024-07-24 23:40:35.090637] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:50.181 [2024-07-24 23:40:35.090661] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:50.747 23:40:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:50.747 23:40:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:18:50.747 23:40:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:50.747 23:40:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:51.006 BaseBdev1_malloc 00:18:51.006 23:40:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:51.265 [2024-07-24 23:40:36.017776] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:51.265 [2024-07-24 23:40:36.017810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.265 [2024-07-24 23:40:36.017824] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c81c0 00:18:51.265 [2024-07-24 23:40:36.017830] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.265 [2024-07-24 23:40:36.018995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.265 [2024-07-24 23:40:36.019015] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:51.265 BaseBdev1 00:18:51.265 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:18:51.265 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:51.265 BaseBdev2_malloc 00:18:51.265 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:51.523 [2024-07-24 23:40:36.382042] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:51.523 [2024-07-24 23:40:36.382069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.523 [2024-07-24 23:40:36.382082] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c8ce0 00:18:51.523 [2024-07-24 23:40:36.382087] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.523 
[2024-07-24 23:40:36.383014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.523 [2024-07-24 23:40:36.383032] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:51.523 BaseBdev2 00:18:51.523 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:51.782 spare_malloc 00:18:51.782 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:51.782 spare_delay 00:18:51.782 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:52.041 [2024-07-24 23:40:36.898688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:52.041 [2024-07-24 23:40:36.898717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.041 [2024-07-24 23:40:36.898727] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b77340 00:18:52.041 [2024-07-24 23:40:36.898733] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.041 [2024-07-24 23:40:36.899701] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.041 [2024-07-24 23:40:36.899720] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:52.041 spare 00:18:52.041 23:40:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 
00:18:52.300 [2024-07-24 23:40:37.063131] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:52.300 [2024-07-24 23:40:37.063944] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:52.300 [2024-07-24 23:40:37.064057] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b784f0 00:18:52.300 [2024-07-24 23:40:37.064066] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:52.300 [2024-07-24 23:40:37.064190] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b71910 00:18:52.300 [2024-07-24 23:40:37.064285] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b784f0 00:18:52.300 [2024-07-24 23:40:37.064290] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b784f0 00:18:52.300 [2024-07-24 23:40:37.064349] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.300 "name": "raid_bdev1", 00:18:52.300 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:52.300 "strip_size_kb": 0, 00:18:52.300 "state": "online", 00:18:52.300 "raid_level": "raid1", 00:18:52.300 "superblock": true, 00:18:52.300 "num_base_bdevs": 2, 00:18:52.300 "num_base_bdevs_discovered": 2, 00:18:52.300 "num_base_bdevs_operational": 2, 00:18:52.300 "base_bdevs_list": [ 00:18:52.300 { 00:18:52.300 "name": "BaseBdev1", 00:18:52.300 "uuid": "d67490b9-faec-5183-8465-ccdd2b87432c", 00:18:52.300 "is_configured": true, 00:18:52.300 "data_offset": 2048, 00:18:52.300 "data_size": 63488 00:18:52.300 }, 00:18:52.300 { 00:18:52.300 "name": "BaseBdev2", 00:18:52.300 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:52.300 "is_configured": true, 00:18:52.300 "data_offset": 2048, 00:18:52.300 "data_size": 63488 00:18:52.300 } 00:18:52.300 ] 00:18:52.300 }' 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.300 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:18:52.868 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:52.868 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:18:53.126 
[2024-07-24 23:40:37.905442] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.126 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:18:53.126 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.126 23:40:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:53.126 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:18:53.126 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:18:53.126 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:53.126 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:53.385 [2024-07-24 23:40:38.171772] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b790d0 00:18:53.385 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:53.385 Zero copy mechanism will not be used. 00:18:53.386 Running I/O for 60 seconds... 
00:18:53.386 [2024-07-24 23:40:38.246888] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:53.386 [2024-07-24 23:40:38.252034] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1b790d0 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.386 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.644 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.644 "name": "raid_bdev1", 00:18:53.644 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:53.644 "strip_size_kb": 0, 00:18:53.644 "state": "online", 00:18:53.645 "raid_level": 
"raid1", 00:18:53.645 "superblock": true, 00:18:53.645 "num_base_bdevs": 2, 00:18:53.645 "num_base_bdevs_discovered": 1, 00:18:53.645 "num_base_bdevs_operational": 1, 00:18:53.645 "base_bdevs_list": [ 00:18:53.645 { 00:18:53.645 "name": null, 00:18:53.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.645 "is_configured": false, 00:18:53.645 "data_offset": 2048, 00:18:53.645 "data_size": 63488 00:18:53.645 }, 00:18:53.645 { 00:18:53.645 "name": "BaseBdev2", 00:18:53.645 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:53.645 "is_configured": true, 00:18:53.645 "data_offset": 2048, 00:18:53.645 "data_size": 63488 00:18:53.645 } 00:18:53.645 ] 00:18:53.645 }' 00:18:53.645 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.645 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:18:54.212 23:40:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:54.212 [2024-07-24 23:40:39.101064] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:54.212 [2024-07-24 23:40:39.136555] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1afb610 00:18:54.212 [2024-07-24 23:40:39.138127] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:54.212 23:40:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:18:54.471 [2024-07-24 23:40:39.250383] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:54.471 [2024-07-24 23:40:39.250747] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:54.471 [2024-07-24 23:40:39.468494] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:54.471 [2024-07-24 23:40:39.468612] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:55.076 [2024-07-24 23:40:39.787807] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:55.076 [2024-07-24 23:40:39.788058] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:18:55.076 [2024-07-24 23:40:40.001224] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.378 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.379 [2024-07-24 23:40:40.240158] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:18:55.379 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:55.379 "name": "raid_bdev1", 00:18:55.379 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:55.379 
"strip_size_kb": 0, 00:18:55.379 "state": "online", 00:18:55.379 "raid_level": "raid1", 00:18:55.379 "superblock": true, 00:18:55.379 "num_base_bdevs": 2, 00:18:55.379 "num_base_bdevs_discovered": 2, 00:18:55.379 "num_base_bdevs_operational": 2, 00:18:55.379 "process": { 00:18:55.379 "type": "rebuild", 00:18:55.379 "target": "spare", 00:18:55.379 "progress": { 00:18:55.379 "blocks": 14336, 00:18:55.379 "percent": 22 00:18:55.379 } 00:18:55.379 }, 00:18:55.379 "base_bdevs_list": [ 00:18:55.379 { 00:18:55.379 "name": "spare", 00:18:55.379 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:18:55.379 "is_configured": true, 00:18:55.379 "data_offset": 2048, 00:18:55.379 "data_size": 63488 00:18:55.379 }, 00:18:55.379 { 00:18:55.379 "name": "BaseBdev2", 00:18:55.379 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:55.379 "is_configured": true, 00:18:55.379 "data_offset": 2048, 00:18:55.379 "data_size": 63488 00:18:55.379 } 00:18:55.379 ] 00:18:55.379 }' 00:18:55.379 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:55.379 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:55.379 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:55.637 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:55.638 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:55.638 [2024-07-24 23:40:40.460088] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:55.638 [2024-07-24 23:40:40.549514] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:55.896 [2024-07-24 23:40:40.674853] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:55.896 [2024-07-24 23:40:40.681316] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.896 [2024-07-24 23:40:40.681336] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:55.896 [2024-07-24 23:40:40.681341] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:18:55.896 [2024-07-24 23:40:40.696831] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1b790d0 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.896 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.155 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.155 "name": "raid_bdev1", 00:18:56.155 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:56.155 "strip_size_kb": 0, 00:18:56.155 "state": "online", 00:18:56.155 "raid_level": "raid1", 00:18:56.155 "superblock": true, 00:18:56.155 "num_base_bdevs": 2, 00:18:56.155 "num_base_bdevs_discovered": 1, 00:18:56.155 "num_base_bdevs_operational": 1, 00:18:56.155 "base_bdevs_list": [ 00:18:56.155 { 00:18:56.155 "name": null, 00:18:56.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.155 "is_configured": false, 00:18:56.155 "data_offset": 2048, 00:18:56.155 "data_size": 63488 00:18:56.155 }, 00:18:56.155 { 00:18:56.155 "name": "BaseBdev2", 00:18:56.155 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:56.155 "is_configured": true, 00:18:56.155 "data_offset": 2048, 00:18:56.155 "data_size": 63488 00:18:56.155 } 00:18:56.155 ] 00:18:56.155 }' 00:18:56.155 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.155 23:40:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:18:56.414 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:56.673 "name": "raid_bdev1", 00:18:56.673 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:56.673 "strip_size_kb": 0, 00:18:56.673 "state": "online", 00:18:56.673 "raid_level": "raid1", 00:18:56.673 "superblock": true, 00:18:56.673 "num_base_bdevs": 2, 00:18:56.673 "num_base_bdevs_discovered": 1, 00:18:56.673 "num_base_bdevs_operational": 1, 00:18:56.673 "base_bdevs_list": [ 00:18:56.673 { 00:18:56.673 "name": null, 00:18:56.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.673 "is_configured": false, 00:18:56.673 "data_offset": 2048, 00:18:56.673 "data_size": 63488 00:18:56.673 }, 00:18:56.673 { 00:18:56.673 "name": "BaseBdev2", 00:18:56.673 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:56.673 "is_configured": true, 00:18:56.673 "data_offset": 2048, 00:18:56.673 "data_size": 63488 00:18:56.673 } 00:18:56.673 ] 00:18:56.673 }' 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:56.673 23:40:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:56.932 [2024-07-24 23:40:41.812248] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:56.932 23:40:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:18:56.932 [2024-07-24 23:40:41.868328] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aff5c0 00:18:56.932 [2024-07-24 23:40:41.869385] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:57.191 [2024-07-24 23:40:41.993485] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:57.191 [2024-07-24 23:40:41.993839] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:18:57.449 [2024-07-24 23:40:42.216911] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:57.449 [2024-07-24 23:40:42.217048] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:18:57.708 [2024-07-24 23:40:42.677322] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.967 23:40:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:58.226 "name": "raid_bdev1", 00:18:58.226 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:58.226 "strip_size_kb": 0, 00:18:58.226 "state": "online", 00:18:58.226 "raid_level": "raid1", 00:18:58.226 "superblock": true, 00:18:58.226 "num_base_bdevs": 2, 00:18:58.226 "num_base_bdevs_discovered": 2, 00:18:58.226 "num_base_bdevs_operational": 2, 00:18:58.226 "process": { 00:18:58.226 "type": "rebuild", 00:18:58.226 "target": "spare", 00:18:58.226 "progress": { 00:18:58.226 "blocks": 14336, 00:18:58.226 "percent": 22 00:18:58.226 } 00:18:58.226 }, 00:18:58.226 "base_bdevs_list": [ 00:18:58.226 { 00:18:58.226 "name": "spare", 00:18:58.226 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:18:58.226 "is_configured": true, 00:18:58.226 "data_offset": 2048, 00:18:58.226 "data_size": 63488 00:18:58.226 }, 00:18:58.226 { 00:18:58.226 "name": "BaseBdev2", 00:18:58.226 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:58.226 "is_configured": true, 00:18:58.226 "data_offset": 2048, 00:18:58.226 "data_size": 63488 00:18:58.226 } 00:18:58.226 ] 00:18:58.226 }' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:58.226 [2024-07-24 23:40:43.043684] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:58.226 [2024-07-24 23:40:43.043901] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == 
\s\p\a\r\e ]] 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:18:58.226 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=636 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.226 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:58.486 "name": "raid_bdev1", 00:18:58.486 "uuid": 
"69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:58.486 "strip_size_kb": 0, 00:18:58.486 "state": "online", 00:18:58.486 "raid_level": "raid1", 00:18:58.486 "superblock": true, 00:18:58.486 "num_base_bdevs": 2, 00:18:58.486 "num_base_bdevs_discovered": 2, 00:18:58.486 "num_base_bdevs_operational": 2, 00:18:58.486 "process": { 00:18:58.486 "type": "rebuild", 00:18:58.486 "target": "spare", 00:18:58.486 "progress": { 00:18:58.486 "blocks": 18432, 00:18:58.486 "percent": 29 00:18:58.486 } 00:18:58.486 }, 00:18:58.486 "base_bdevs_list": [ 00:18:58.486 { 00:18:58.486 "name": "spare", 00:18:58.486 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:18:58.486 "is_configured": true, 00:18:58.486 "data_offset": 2048, 00:18:58.486 "data_size": 63488 00:18:58.486 }, 00:18:58.486 { 00:18:58.486 "name": "BaseBdev2", 00:18:58.486 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:58.486 "is_configured": true, 00:18:58.486 "data_offset": 2048, 00:18:58.486 "data_size": 63488 00:18:58.486 } 00:18:58.486 ] 00:18:58.486 }' 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:58.486 23:40:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:58.486 [2024-07-24 23:40:43.484732] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:18:59.076 [2024-07-24 23:40:43.814512] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:18:59.334 [2024-07-24 23:40:44.290860] bdev_raid.c: 851:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:18:59.594 "name": "raid_bdev1", 00:18:59.594 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:18:59.594 "strip_size_kb": 0, 00:18:59.594 "state": "online", 00:18:59.594 "raid_level": "raid1", 00:18:59.594 "superblock": true, 00:18:59.594 "num_base_bdevs": 2, 00:18:59.594 "num_base_bdevs_discovered": 2, 00:18:59.594 "num_base_bdevs_operational": 2, 00:18:59.594 "process": { 00:18:59.594 "type": "rebuild", 00:18:59.594 "target": "spare", 00:18:59.594 "progress": { 00:18:59.594 "blocks": 36864, 00:18:59.594 "percent": 58 00:18:59.594 } 00:18:59.594 }, 00:18:59.594 "base_bdevs_list": [ 00:18:59.594 { 00:18:59.594 "name": "spare", 00:18:59.594 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:18:59.594 "is_configured": true, 00:18:59.594 "data_offset": 2048, 00:18:59.594 "data_size": 
63488 00:18:59.594 }, 00:18:59.594 { 00:18:59.594 "name": "BaseBdev2", 00:18:59.594 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:18:59.594 "is_configured": true, 00:18:59.594 "data_offset": 2048, 00:18:59.594 "data_size": 63488 00:18:59.594 } 00:18:59.594 ] 00:18:59.594 }' 00:18:59.594 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:18:59.853 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:59.853 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:18:59.853 [2024-07-24 23:40:44.608285] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:18:59.853 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:18:59.853 23:40:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:18:59.853 [2024-07-24 23:40:44.826786] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.790 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.790 [2024-07-24 23:40:45.693286] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:19:00.790 [2024-07-24 23:40:45.693629] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:19:01.048 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:01.048 "name": "raid_bdev1", 00:19:01.048 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:01.048 "strip_size_kb": 0, 00:19:01.048 "state": "online", 00:19:01.048 "raid_level": "raid1", 00:19:01.048 "superblock": true, 00:19:01.048 "num_base_bdevs": 2, 00:19:01.048 "num_base_bdevs_discovered": 2, 00:19:01.048 "num_base_bdevs_operational": 2, 00:19:01.048 "process": { 00:19:01.048 "type": "rebuild", 00:19:01.048 "target": "spare", 00:19:01.048 "progress": { 00:19:01.048 "blocks": 57344, 00:19:01.048 "percent": 90 00:19:01.048 } 00:19:01.048 }, 00:19:01.048 "base_bdevs_list": [ 00:19:01.048 { 00:19:01.049 "name": "spare", 00:19:01.049 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:01.049 "is_configured": true, 00:19:01.049 "data_offset": 2048, 00:19:01.049 "data_size": 63488 00:19:01.049 }, 00:19:01.049 { 00:19:01.049 "name": "BaseBdev2", 00:19:01.049 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:01.049 "is_configured": true, 00:19:01.049 "data_offset": 2048, 00:19:01.049 "data_size": 63488 00:19:01.049 } 00:19:01.049 ] 00:19:01.049 }' 00:19:01.049 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:01.049 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:19:01.049 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:01.049 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:01.049 23:40:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:01.049 [2024-07-24 23:40:45.906885] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:19:01.049 [2024-07-24 23:40:45.907033] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:19:01.307 [2024-07-24 23:40:46.234551] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:01.566 [2024-07-24 23:40:46.334852] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:01.566 [2024-07-24 23:40:46.336311] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.134 23:40:46 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.134 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:02.134 "name": "raid_bdev1", 00:19:02.134 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:02.134 "strip_size_kb": 0, 00:19:02.134 "state": "online", 00:19:02.134 "raid_level": "raid1", 00:19:02.134 "superblock": true, 00:19:02.134 "num_base_bdevs": 2, 00:19:02.134 "num_base_bdevs_discovered": 2, 00:19:02.134 "num_base_bdevs_operational": 2, 00:19:02.134 "base_bdevs_list": [ 00:19:02.135 { 00:19:02.135 "name": "spare", 00:19:02.135 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:02.135 "is_configured": true, 00:19:02.135 "data_offset": 2048, 00:19:02.135 "data_size": 63488 00:19:02.135 }, 00:19:02.135 { 00:19:02.135 "name": "BaseBdev2", 00:19:02.135 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:02.135 "is_configured": true, 00:19:02.135 "data_offset": 2048, 00:19:02.135 "data_size": 63488 00:19:02.135 } 00:19:02.135 ] 00:19:02.135 }' 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:02.135 23:40:47 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.135 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:02.394 "name": "raid_bdev1", 00:19:02.394 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:02.394 "strip_size_kb": 0, 00:19:02.394 "state": "online", 00:19:02.394 "raid_level": "raid1", 00:19:02.394 "superblock": true, 00:19:02.394 "num_base_bdevs": 2, 00:19:02.394 "num_base_bdevs_discovered": 2, 00:19:02.394 "num_base_bdevs_operational": 2, 00:19:02.394 "base_bdevs_list": [ 00:19:02.394 { 00:19:02.394 "name": "spare", 00:19:02.394 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:02.394 "is_configured": true, 00:19:02.394 "data_offset": 2048, 00:19:02.394 "data_size": 63488 00:19:02.394 }, 00:19:02.394 { 00:19:02.394 "name": "BaseBdev2", 00:19:02.394 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:02.394 "is_configured": true, 00:19:02.394 "data_offset": 2048, 00:19:02.394 "data_size": 63488 00:19:02.394 } 00:19:02.394 ] 00:19:02.394 }' 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:02.394 23:40:47 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.394 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.653 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.653 "name": "raid_bdev1", 00:19:02.653 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:02.653 "strip_size_kb": 0, 00:19:02.653 "state": "online", 00:19:02.653 "raid_level": "raid1", 00:19:02.653 "superblock": true, 00:19:02.653 "num_base_bdevs": 2, 00:19:02.653 "num_base_bdevs_discovered": 2, 00:19:02.653 "num_base_bdevs_operational": 2, 00:19:02.653 "base_bdevs_list": [ 00:19:02.653 { 00:19:02.653 "name": "spare", 00:19:02.653 "uuid": 
"657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:02.653 "is_configured": true, 00:19:02.653 "data_offset": 2048, 00:19:02.653 "data_size": 63488 00:19:02.653 }, 00:19:02.653 { 00:19:02.653 "name": "BaseBdev2", 00:19:02.653 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:02.653 "is_configured": true, 00:19:02.653 "data_offset": 2048, 00:19:02.653 "data_size": 63488 00:19:02.653 } 00:19:02.653 ] 00:19:02.653 }' 00:19:02.653 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.653 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:03.219 23:40:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:03.219 [2024-07-24 23:40:48.086635] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:03.219 [2024-07-24 23:40:48.086658] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:03.219 00:19:03.219 Latency(us) 00:19:03.219 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:03.219 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:03.219 raid_bdev1 : 9.95 119.66 358.98 0.00 0.00 11267.69 243.81 112347.43 00:19:03.219 =================================================================================================================== 00:19:03.219 Total : 119.66 358.98 0.00 0.00 11267.69 243.81 112347.43 00:19:03.219 [2024-07-24 23:40:48.153442] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.219 [2024-07-24 23:40:48.153462] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:03.219 [2024-07-24 23:40:48.153532] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:03.219 [2024-07-24 
23:40:48.153539] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b784f0 name raid_bdev1, state offline 00:19:03.219 0 00:19:03.219 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.219 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:03.478 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:03.737 /dev/nbd0 00:19:03.737 23:40:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:03.737 1+0 records in 00:19:03.737 1+0 records out 00:19:03.737 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240468 s, 17.0 MB/s 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:03.737 23:40:48 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:03.737 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:03.737 /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:03.996 1+0 records in 00:19:03.996 1+0 records out 00:19:03.996 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000185274 s, 22.1 MB/s 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 
1 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:03.996 23:40:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:03.996 23:40:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:04.256 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:04.515 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:04.515 [2024-07-24 23:40:49.500326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:04.515 [2024-07-24 23:40:49.500357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:04.515 [2024-07-24 23:40:49.500368] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c7910 00:19:04.515 [2024-07-24 23:40:49.500394] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:04.515 [2024-07-24 23:40:49.501576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:04.515 [2024-07-24 23:40:49.501595] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:04.515 [2024-07-24 23:40:49.501646] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:04.515 [2024-07-24 23:40:49.501667] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:04.515 [2024-07-24 23:40:49.501737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:04.515 spare 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:04.774 23:40:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.774 [2024-07-24 23:40:49.602030] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afa7e0 00:19:04.774 [2024-07-24 23:40:49.602040] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:04.774 [2024-07-24 23:40:49.602158] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b713d0 00:19:04.774 [2024-07-24 23:40:49.602256] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afa7e0 00:19:04.774 [2024-07-24 23:40:49.602261] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1afa7e0 00:19:04.774 [2024-07-24 23:40:49.602326] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.774 "name": "raid_bdev1", 00:19:04.774 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:04.774 "strip_size_kb": 0, 00:19:04.774 "state": "online", 00:19:04.774 "raid_level": "raid1", 00:19:04.774 "superblock": true, 00:19:04.774 "num_base_bdevs": 2, 00:19:04.774 "num_base_bdevs_discovered": 2, 00:19:04.774 "num_base_bdevs_operational": 2, 00:19:04.774 "base_bdevs_list": [ 00:19:04.774 { 
00:19:04.774 "name": "spare", 00:19:04.774 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:04.774 "is_configured": true, 00:19:04.774 "data_offset": 2048, 00:19:04.774 "data_size": 63488 00:19:04.774 }, 00:19:04.774 { 00:19:04.774 "name": "BaseBdev2", 00:19:04.774 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:04.774 "is_configured": true, 00:19:04.774 "data_offset": 2048, 00:19:04.774 "data_size": 63488 00:19:04.774 } 00:19:04.774 ] 00:19:04.774 }' 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.774 23:40:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:05.342 "name": "raid_bdev1", 00:19:05.342 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:05.342 "strip_size_kb": 0, 00:19:05.342 "state": "online", 00:19:05.342 "raid_level": "raid1", 00:19:05.342 "superblock": true, 00:19:05.342 "num_base_bdevs": 2, 00:19:05.342 "num_base_bdevs_discovered": 2, 00:19:05.342 
"num_base_bdevs_operational": 2, 00:19:05.342 "base_bdevs_list": [ 00:19:05.342 { 00:19:05.342 "name": "spare", 00:19:05.342 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:05.342 "is_configured": true, 00:19:05.342 "data_offset": 2048, 00:19:05.342 "data_size": 63488 00:19:05.342 }, 00:19:05.342 { 00:19:05.342 "name": "BaseBdev2", 00:19:05.342 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:05.342 "is_configured": true, 00:19:05.342 "data_offset": 2048, 00:19:05.342 "data_size": 63488 00:19:05.342 } 00:19:05.342 ] 00:19:05.342 }' 00:19:05.342 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:05.601 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:05.860 [2024-07-24 23:40:50.719640] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.860 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.118 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.118 "name": "raid_bdev1", 00:19:06.118 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:06.118 "strip_size_kb": 0, 00:19:06.118 "state": "online", 00:19:06.118 "raid_level": "raid1", 00:19:06.118 "superblock": true, 00:19:06.118 "num_base_bdevs": 2, 00:19:06.118 "num_base_bdevs_discovered": 1, 00:19:06.118 "num_base_bdevs_operational": 1, 00:19:06.118 "base_bdevs_list": [ 00:19:06.118 { 00:19:06.118 "name": null, 00:19:06.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.118 "is_configured": false, 00:19:06.118 "data_offset": 2048, 00:19:06.118 "data_size": 63488 00:19:06.118 }, 00:19:06.118 { 00:19:06.118 "name": "BaseBdev2", 
00:19:06.118 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:06.118 "is_configured": true, 00:19:06.118 "data_offset": 2048, 00:19:06.118 "data_size": 63488 00:19:06.118 } 00:19:06.118 ] 00:19:06.118 }' 00:19:06.118 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.118 23:40:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:06.685 23:40:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:06.685 [2024-07-24 23:40:51.537852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:06.685 [2024-07-24 23:40:51.537974] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:06.685 [2024-07-24 23:40:51.537983] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:06.685 [2024-07-24 23:40:51.538004] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:06.685 [2024-07-24 23:40:51.542667] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b713d0 00:19:06.685 [2024-07-24 23:40:51.544141] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:06.685 23:40:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.621 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.880 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:07.880 "name": "raid_bdev1", 00:19:07.880 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:07.880 "strip_size_kb": 0, 00:19:07.880 "state": "online", 00:19:07.880 "raid_level": "raid1", 00:19:07.880 "superblock": true, 00:19:07.880 "num_base_bdevs": 2, 00:19:07.880 "num_base_bdevs_discovered": 2, 00:19:07.880 "num_base_bdevs_operational": 2, 00:19:07.880 "process": { 00:19:07.880 "type": "rebuild", 00:19:07.880 "target": "spare", 00:19:07.880 "progress": { 00:19:07.880 "blocks": 22528, 
00:19:07.880 "percent": 35 00:19:07.880 } 00:19:07.880 }, 00:19:07.880 "base_bdevs_list": [ 00:19:07.880 { 00:19:07.881 "name": "spare", 00:19:07.881 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:07.881 "is_configured": true, 00:19:07.881 "data_offset": 2048, 00:19:07.881 "data_size": 63488 00:19:07.881 }, 00:19:07.881 { 00:19:07.881 "name": "BaseBdev2", 00:19:07.881 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:07.881 "is_configured": true, 00:19:07.881 "data_offset": 2048, 00:19:07.881 "data_size": 63488 00:19:07.881 } 00:19:07.881 ] 00:19:07.881 }' 00:19:07.881 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:07.881 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:07.881 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:07.881 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:07.881 23:40:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:08.140 [2024-07-24 23:40:52.974555] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.140 [2024-07-24 23:40:53.054734] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:08.140 [2024-07-24 23:40:53.054767] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.140 [2024-07-24 23:40:53.054791] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:08.140 [2024-07-24 23:40:53.054796] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.140 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.399 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.399 "name": "raid_bdev1", 00:19:08.399 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:08.399 "strip_size_kb": 0, 00:19:08.399 "state": "online", 00:19:08.399 "raid_level": "raid1", 00:19:08.399 "superblock": true, 00:19:08.399 "num_base_bdevs": 2, 00:19:08.399 "num_base_bdevs_discovered": 1, 00:19:08.399 "num_base_bdevs_operational": 1, 00:19:08.399 "base_bdevs_list": [ 00:19:08.399 { 00:19:08.399 "name": null, 00:19:08.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.399 "is_configured": false, 00:19:08.399 
"data_offset": 2048, 00:19:08.399 "data_size": 63488 00:19:08.399 }, 00:19:08.399 { 00:19:08.399 "name": "BaseBdev2", 00:19:08.399 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:08.399 "is_configured": true, 00:19:08.399 "data_offset": 2048, 00:19:08.399 "data_size": 63488 00:19:08.399 } 00:19:08.399 ] 00:19:08.399 }' 00:19:08.399 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.399 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:08.964 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:08.964 [2024-07-24 23:40:53.881288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:08.964 [2024-07-24 23:40:53.881326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.964 [2024-07-24 23:40:53.881340] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b74f50 00:19:08.964 [2024-07-24 23:40:53.881346] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.964 [2024-07-24 23:40:53.881641] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.964 [2024-07-24 23:40:53.881652] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:08.964 [2024-07-24 23:40:53.881712] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:08.964 [2024-07-24 23:40:53.881719] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:08.964 [2024-07-24 23:40:53.881725] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:08.964 [2024-07-24 23:40:53.881747] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:08.964 [2024-07-24 23:40:53.886397] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b713d0 00:19:08.964 spare 00:19:08.964 [2024-07-24 23:40:53.887434] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:08.964 23:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.340 23:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:10.340 "name": "raid_bdev1", 00:19:10.340 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:10.340 "strip_size_kb": 0, 00:19:10.340 "state": "online", 00:19:10.340 "raid_level": "raid1", 00:19:10.340 "superblock": true, 00:19:10.340 "num_base_bdevs": 2, 00:19:10.340 "num_base_bdevs_discovered": 2, 00:19:10.340 "num_base_bdevs_operational": 2, 00:19:10.340 "process": { 00:19:10.340 "type": "rebuild", 00:19:10.340 "target": "spare", 00:19:10.340 "progress": { 00:19:10.340 
"blocks": 22528, 00:19:10.340 "percent": 35 00:19:10.340 } 00:19:10.340 }, 00:19:10.340 "base_bdevs_list": [ 00:19:10.340 { 00:19:10.340 "name": "spare", 00:19:10.340 "uuid": "657d6b7f-5e5d-5f12-99ca-d7d8c46a451e", 00:19:10.340 "is_configured": true, 00:19:10.340 "data_offset": 2048, 00:19:10.340 "data_size": 63488 00:19:10.340 }, 00:19:10.340 { 00:19:10.340 "name": "BaseBdev2", 00:19:10.340 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:10.340 "is_configured": true, 00:19:10.340 "data_offset": 2048, 00:19:10.340 "data_size": 63488 00:19:10.340 } 00:19:10.340 ] 00:19:10.340 }' 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:10.340 [2024-07-24 23:40:55.290067] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:10.340 [2024-07-24 23:40:55.297296] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:10.340 [2024-07-24 23:40:55.297326] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.340 [2024-07-24 23:40:55.297335] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:10.340 [2024-07-24 23:40:55.297339] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.340 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.599 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.599 "name": "raid_bdev1", 00:19:10.599 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:10.599 "strip_size_kb": 0, 00:19:10.599 "state": "online", 00:19:10.599 "raid_level": "raid1", 00:19:10.599 "superblock": true, 00:19:10.599 "num_base_bdevs": 2, 00:19:10.599 "num_base_bdevs_discovered": 1, 00:19:10.599 "num_base_bdevs_operational": 1, 00:19:10.599 "base_bdevs_list": [ 00:19:10.599 { 00:19:10.599 "name": null, 00:19:10.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.599 "is_configured": false, 00:19:10.599 
"data_offset": 2048, 00:19:10.599 "data_size": 63488 00:19:10.599 }, 00:19:10.599 { 00:19:10.599 "name": "BaseBdev2", 00:19:10.599 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:10.599 "is_configured": true, 00:19:10.599 "data_offset": 2048, 00:19:10.599 "data_size": 63488 00:19:10.599 } 00:19:10.599 ] 00:19:10.599 }' 00:19:10.599 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.599 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:11.166 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:11.166 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:11.166 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:11.167 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:11.167 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:11.167 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.167 23:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.167 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:11.167 "name": "raid_bdev1", 00:19:11.167 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:11.167 "strip_size_kb": 0, 00:19:11.167 "state": "online", 00:19:11.167 "raid_level": "raid1", 00:19:11.167 "superblock": true, 00:19:11.167 "num_base_bdevs": 2, 00:19:11.167 "num_base_bdevs_discovered": 1, 00:19:11.167 "num_base_bdevs_operational": 1, 00:19:11.167 "base_bdevs_list": [ 00:19:11.167 { 00:19:11.167 "name": null, 00:19:11.167 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:11.167 "is_configured": false, 00:19:11.167 "data_offset": 2048, 00:19:11.167 "data_size": 63488 00:19:11.167 }, 00:19:11.167 { 00:19:11.167 "name": "BaseBdev2", 00:19:11.167 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:11.167 "is_configured": true, 00:19:11.167 "data_offset": 2048, 00:19:11.167 "data_size": 63488 00:19:11.167 } 00:19:11.167 ] 00:19:11.167 }' 00:19:11.167 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:11.425 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:11.425 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:11.425 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:11.425 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:11.425 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:11.684 [2024-07-24 23:40:56.556967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:11.684 [2024-07-24 23:40:56.557000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:11.684 [2024-07-24 23:40:56.557011] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afab60 00:19:11.684 [2024-07-24 23:40:56.557017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:11.684 [2024-07-24 23:40:56.557258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:11.684 [2024-07-24 23:40:56.557267] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:11.684 [2024-07-24 23:40:56.557310] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:19:11.684 [2024-07-24 23:40:56.557317] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:11.684 [2024-07-24 23:40:56.557326] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:11.684 BaseBdev1 00:19:11.684 23:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.619 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.620 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.620 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.620 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.620 23:40:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:12.879 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.879 "name": "raid_bdev1", 00:19:12.879 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:12.879 "strip_size_kb": 0, 00:19:12.879 "state": "online", 00:19:12.879 "raid_level": "raid1", 00:19:12.879 "superblock": true, 00:19:12.879 "num_base_bdevs": 2, 00:19:12.879 "num_base_bdevs_discovered": 1, 00:19:12.879 "num_base_bdevs_operational": 1, 00:19:12.879 "base_bdevs_list": [ 00:19:12.879 { 00:19:12.879 "name": null, 00:19:12.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.879 "is_configured": false, 00:19:12.879 "data_offset": 2048, 00:19:12.879 "data_size": 63488 00:19:12.879 }, 00:19:12.879 { 00:19:12.879 "name": "BaseBdev2", 00:19:12.879 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:12.879 "is_configured": true, 00:19:12.879 "data_offset": 2048, 00:19:12.879 "data_size": 63488 00:19:12.879 } 00:19:12.879 ] 00:19:12.879 }' 00:19:12.879 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.879 23:40:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:13.445 "name": "raid_bdev1", 00:19:13.445 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:13.445 "strip_size_kb": 0, 00:19:13.445 "state": "online", 00:19:13.445 "raid_level": "raid1", 00:19:13.445 "superblock": true, 00:19:13.445 "num_base_bdevs": 2, 00:19:13.445 "num_base_bdevs_discovered": 1, 00:19:13.445 "num_base_bdevs_operational": 1, 00:19:13.445 "base_bdevs_list": [ 00:19:13.445 { 00:19:13.445 "name": null, 00:19:13.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.445 "is_configured": false, 00:19:13.445 "data_offset": 2048, 00:19:13.445 "data_size": 63488 00:19:13.445 }, 00:19:13.445 { 00:19:13.445 "name": "BaseBdev2", 00:19:13.445 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:13.445 "is_configured": true, 00:19:13.445 "data_offset": 2048, 00:19:13.445 "data_size": 63488 00:19:13.445 } 00:19:13.445 ] 00:19:13.445 }' 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:13.445 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local 
es=0 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:13.704 [2024-07-24 23:40:58.622491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:13.704 [2024-07-24 23:40:58.622585] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:13.704 
[2024-07-24 23:40:58.622593] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:13.704 request: 00:19:13.704 { 00:19:13.704 "base_bdev": "BaseBdev1", 00:19:13.704 "raid_bdev": "raid_bdev1", 00:19:13.704 "method": "bdev_raid_add_base_bdev", 00:19:13.704 "req_id": 1 00:19:13.704 } 00:19:13.704 Got JSON-RPC error response 00:19:13.704 response: 00:19:13.704 { 00:19:13.704 "code": -22, 00:19:13.704 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:13.704 } 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:19:13.704 23:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.114 23:40:59 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.114 "name": "raid_bdev1", 00:19:15.114 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:15.114 "strip_size_kb": 0, 00:19:15.114 "state": "online", 00:19:15.114 "raid_level": "raid1", 00:19:15.114 "superblock": true, 00:19:15.114 "num_base_bdevs": 2, 00:19:15.114 "num_base_bdevs_discovered": 1, 00:19:15.114 "num_base_bdevs_operational": 1, 00:19:15.114 "base_bdevs_list": [ 00:19:15.114 { 00:19:15.114 "name": null, 00:19:15.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.114 "is_configured": false, 00:19:15.114 "data_offset": 2048, 00:19:15.114 "data_size": 63488 00:19:15.114 }, 00:19:15.114 { 00:19:15.114 "name": "BaseBdev2", 00:19:15.114 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:15.114 "is_configured": true, 00:19:15.114 "data_offset": 2048, 00:19:15.114 "data_size": 63488 00:19:15.114 } 00:19:15.114 ] 00:19:15.114 }' 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.114 23:40:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:15.403 23:41:00 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.403 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:15.662 "name": "raid_bdev1", 00:19:15.662 "uuid": "69112ad9-3bc8-48e8-9642-c37c8484a5ca", 00:19:15.662 "strip_size_kb": 0, 00:19:15.662 "state": "online", 00:19:15.662 "raid_level": "raid1", 00:19:15.662 "superblock": true, 00:19:15.662 "num_base_bdevs": 2, 00:19:15.662 "num_base_bdevs_discovered": 1, 00:19:15.662 "num_base_bdevs_operational": 1, 00:19:15.662 "base_bdevs_list": [ 00:19:15.662 { 00:19:15.662 "name": null, 00:19:15.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.662 "is_configured": false, 00:19:15.662 "data_offset": 2048, 00:19:15.662 "data_size": 63488 00:19:15.662 }, 00:19:15.662 { 00:19:15.662 "name": "BaseBdev2", 00:19:15.662 "uuid": "37a6e5ed-2702-530f-8dc8-5f3a741736fa", 00:19:15.662 "is_configured": true, 00:19:15.662 "data_offset": 2048, 00:19:15.662 "data_size": 63488 00:19:15.662 } 00:19:15.662 ] 00:19:15.662 }' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:15.662 23:41:00 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 356748 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 356748 ']' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 356748 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 356748 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 356748' 00:19:15.662 killing process with pid 356748 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 356748 00:19:15.662 Received shutdown signal, test time was about 22.349259 seconds 00:19:15.662 00:19:15.662 Latency(us) 00:19:15.662 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:15.662 =================================================================================================================== 00:19:15.662 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:15.662 [2024-07-24 23:41:00.577114] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:15.662 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 356748 00:19:15.662 [2024-07-24 23:41:00.577187] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:15.662 
[2024-07-24 23:41:00.577221] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:15.662 [2024-07-24 23:41:00.577227] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afa7e0 name raid_bdev1, state offline 00:19:15.662 [2024-07-24 23:41:00.595694] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:19:15.921 00:19:15.921 real 0m25.920s 00:19:15.921 user 0m39.702s 00:19:15.921 sys 0m2.820s 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:15.921 ************************************ 00:19:15.921 END TEST raid_rebuild_test_sb_io 00:19:15.921 ************************************ 00:19:15.921 23:41:00 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:19:15.921 23:41:00 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:19:15.921 23:41:00 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:19:15.921 23:41:00 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:15.921 23:41:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:15.921 ************************************ 00:19:15.921 START TEST raid_rebuild_test 00:19:15.921 ************************************ 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false false true 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:19:15.921 23:41:00 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=361362 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 361362 /var/tmp/spdk-raid.sock 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 361362 ']' 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:15.921 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:15.921 23:41:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.921 [2024-07-24 23:41:00.893531] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:19:15.921 [2024-07-24 23:41:00.893570] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361362 ] 00:19:15.921 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:15.921 Zero copy mechanism will not be used. 00:19:16.180 [2024-07-24 23:41:00.955616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:16.180 [2024-07-24 23:41:01.035181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:16.180 [2024-07-24 23:41:01.087527] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:16.180 [2024-07-24 23:41:01.087554] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:16.746 23:41:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:16.746 23:41:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:19:16.746 23:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:16.746 23:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:17.004 BaseBdev1_malloc 00:19:17.004 
23:41:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:17.263 [2024-07-24 23:41:02.011429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:17.263 [2024-07-24 23:41:02.011463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.263 [2024-07-24 23:41:02.011488] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c51c0 00:19:17.263 [2024-07-24 23:41:02.011494] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.263 [2024-07-24 23:41:02.012646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.263 [2024-07-24 23:41:02.012667] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:17.263 BaseBdev1 00:19:17.263 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:17.263 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:17.263 BaseBdev2_malloc 00:19:17.263 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:17.522 [2024-07-24 23:41:02.344060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:17.522 [2024-07-24 23:41:02.344091] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.522 [2024-07-24 23:41:02.344109] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c5ce0 00:19:17.522 [2024-07-24 23:41:02.344116] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.522 [2024-07-24 23:41:02.345184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.522 [2024-07-24 23:41:02.345205] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:17.522 BaseBdev2 00:19:17.522 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:17.522 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:17.522 BaseBdev3_malloc 00:19:17.781 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:17.781 [2024-07-24 23:41:02.672407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:17.781 [2024-07-24 23:41:02.672438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.781 [2024-07-24 23:41:02.672451] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1872d70 00:19:17.781 [2024-07-24 23:41:02.672457] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.781 [2024-07-24 23:41:02.673449] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.781 [2024-07-24 23:41:02.673476] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:17.781 BaseBdev3 00:19:17.781 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:17.781 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 
00:19:18.039 BaseBdev4_malloc 00:19:18.039 23:41:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:19:18.039 [2024-07-24 23:41:03.008634] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:18.039 [2024-07-24 23:41:03.008667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.039 [2024-07-24 23:41:03.008681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1871f50 00:19:18.039 [2024-07-24 23:41:03.008703] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.040 [2024-07-24 23:41:03.009740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.040 [2024-07-24 23:41:03.009777] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:18.040 BaseBdev4 00:19:18.040 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:18.298 spare_malloc 00:19:18.298 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:18.604 spare_delay 00:19:18.604 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:18.604 [2024-07-24 23:41:03.497338] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:18.604 [2024-07-24 23:41:03.497368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.604 [2024-07-24 
23:41:03.497392] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1876a30 00:19:18.604 [2024-07-24 23:41:03.497413] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.604 [2024-07-24 23:41:03.498432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.604 [2024-07-24 23:41:03.498452] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:18.604 spare 00:19:18.604 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:18.862 [2024-07-24 23:41:03.657774] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:18.862 [2024-07-24 23:41:03.658632] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.862 [2024-07-24 23:41:03.658671] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:18.862 [2024-07-24 23:41:03.658699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:18.862 [2024-07-24 23:41:03.658756] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f5d20 00:19:18.862 [2024-07-24 23:41:03.658762] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:18.862 [2024-07-24 23:41:03.658908] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1870290 00:19:18.862 [2024-07-24 23:41:03.659008] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f5d20 00:19:18.862 [2024-07-24 23:41:03.659013] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f5d20 00:19:18.862 [2024-07-24 23:41:03.659084] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.862 "name": "raid_bdev1", 00:19:18.862 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:18.862 "strip_size_kb": 0, 00:19:18.862 "state": "online", 00:19:18.862 "raid_level": "raid1", 00:19:18.862 "superblock": false, 00:19:18.862 "num_base_bdevs": 4, 00:19:18.862 "num_base_bdevs_discovered": 4, 00:19:18.862 "num_base_bdevs_operational": 4, 00:19:18.862 "base_bdevs_list": [ 00:19:18.862 { 00:19:18.862 "name": "BaseBdev1", 00:19:18.862 "uuid": "a101ffcb-f90b-5045-9c31-c1492cb900d3", 
00:19:18.862 "is_configured": true, 00:19:18.862 "data_offset": 0, 00:19:18.862 "data_size": 65536 00:19:18.862 }, 00:19:18.862 { 00:19:18.862 "name": "BaseBdev2", 00:19:18.862 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 00:19:18.862 "is_configured": true, 00:19:18.862 "data_offset": 0, 00:19:18.862 "data_size": 65536 00:19:18.862 }, 00:19:18.862 { 00:19:18.862 "name": "BaseBdev3", 00:19:18.862 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:18.862 "is_configured": true, 00:19:18.862 "data_offset": 0, 00:19:18.862 "data_size": 65536 00:19:18.862 }, 00:19:18.862 { 00:19:18.862 "name": "BaseBdev4", 00:19:18.862 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:18.862 "is_configured": true, 00:19:18.862 "data_offset": 0, 00:19:18.862 "data_size": 65536 00:19:18.862 } 00:19:18.862 ] 00:19:18.862 }' 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.862 23:41:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.429 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:19.429 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:19.687 [2024-07-24 23:41:04.476073] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:19.687 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:19.945 [2024-07-24 23:41:04.800743] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1870290 00:19:19.945 /dev/nbd0 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:19.945 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:19.946 1+0 records in 00:19:19.946 1+0 records out 00:19:19.946 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000184312 s, 22.2 MB/s 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:19.946 23:41:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd 
if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:25.213 65536+0 records in 00:19:25.213 65536+0 records out 00:19:25.213 33554432 bytes (34 MB, 32 MiB) copied, 4.38383 s, 7.7 MB/s 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:25.213 [2024-07-24 23:41:09.438942] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:25.213 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:25.214 [2024-07-24 23:41:09.603403] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.214 "name": "raid_bdev1", 00:19:25.214 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:25.214 "strip_size_kb": 0, 00:19:25.214 "state": "online", 00:19:25.214 "raid_level": "raid1", 00:19:25.214 "superblock": false, 00:19:25.214 "num_base_bdevs": 4, 00:19:25.214 
"num_base_bdevs_discovered": 3, 00:19:25.214 "num_base_bdevs_operational": 3, 00:19:25.214 "base_bdevs_list": [ 00:19:25.214 { 00:19:25.214 "name": null, 00:19:25.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.214 "is_configured": false, 00:19:25.214 "data_offset": 0, 00:19:25.214 "data_size": 65536 00:19:25.214 }, 00:19:25.214 { 00:19:25.214 "name": "BaseBdev2", 00:19:25.214 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 00:19:25.214 "is_configured": true, 00:19:25.214 "data_offset": 0, 00:19:25.214 "data_size": 65536 00:19:25.214 }, 00:19:25.214 { 00:19:25.214 "name": "BaseBdev3", 00:19:25.214 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:25.214 "is_configured": true, 00:19:25.214 "data_offset": 0, 00:19:25.214 "data_size": 65536 00:19:25.214 }, 00:19:25.214 { 00:19:25.214 "name": "BaseBdev4", 00:19:25.214 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:25.214 "is_configured": true, 00:19:25.214 "data_offset": 0, 00:19:25.214 "data_size": 65536 00:19:25.214 } 00:19:25.214 ] 00:19:25.214 }' 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.214 23:41:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.473 23:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:25.473 [2024-07-24 23:41:10.413505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:25.473 [2024-07-24 23:41:10.417078] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1870290 00:19:25.473 [2024-07-24 23:41:10.418515] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:25.473 23:41:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:26.850 "name": "raid_bdev1", 00:19:26.850 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:26.850 "strip_size_kb": 0, 00:19:26.850 "state": "online", 00:19:26.850 "raid_level": "raid1", 00:19:26.850 "superblock": false, 00:19:26.850 "num_base_bdevs": 4, 00:19:26.850 "num_base_bdevs_discovered": 4, 00:19:26.850 "num_base_bdevs_operational": 4, 00:19:26.850 "process": { 00:19:26.850 "type": "rebuild", 00:19:26.850 "target": "spare", 00:19:26.850 "progress": { 00:19:26.850 "blocks": 22528, 00:19:26.850 "percent": 34 00:19:26.850 } 00:19:26.850 }, 00:19:26.850 "base_bdevs_list": [ 00:19:26.850 { 00:19:26.850 "name": "spare", 00:19:26.850 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:26.850 "is_configured": true, 00:19:26.850 "data_offset": 0, 00:19:26.850 "data_size": 65536 00:19:26.850 }, 00:19:26.850 { 00:19:26.850 "name": "BaseBdev2", 00:19:26.850 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 00:19:26.850 "is_configured": true, 00:19:26.850 "data_offset": 0, 00:19:26.850 "data_size": 65536 00:19:26.850 }, 00:19:26.850 { 00:19:26.850 "name": "BaseBdev3", 00:19:26.850 "uuid": 
"441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:26.850 "is_configured": true, 00:19:26.850 "data_offset": 0, 00:19:26.850 "data_size": 65536 00:19:26.850 }, 00:19:26.850 { 00:19:26.850 "name": "BaseBdev4", 00:19:26.850 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:26.850 "is_configured": true, 00:19:26.850 "data_offset": 0, 00:19:26.850 "data_size": 65536 00:19:26.850 } 00:19:26.850 ] 00:19:26.850 }' 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:26.850 [2024-07-24 23:41:11.818674] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:26.850 [2024-07-24 23:41:11.828304] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:26.850 [2024-07-24 23:41:11.828333] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:26.850 [2024-07-24 23:41:11.828343] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:26.850 [2024-07-24 23:41:11.828347] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.850 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.110 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.110 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.110 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.110 23:41:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.110 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.110 "name": "raid_bdev1", 00:19:27.110 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:27.110 "strip_size_kb": 0, 00:19:27.110 "state": "online", 00:19:27.110 "raid_level": "raid1", 00:19:27.110 "superblock": false, 00:19:27.110 "num_base_bdevs": 4, 00:19:27.110 "num_base_bdevs_discovered": 3, 00:19:27.110 "num_base_bdevs_operational": 3, 00:19:27.110 "base_bdevs_list": [ 00:19:27.110 { 00:19:27.110 "name": null, 00:19:27.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.110 "is_configured": false, 00:19:27.110 "data_offset": 0, 00:19:27.110 "data_size": 65536 00:19:27.110 }, 00:19:27.110 { 00:19:27.110 "name": "BaseBdev2", 00:19:27.110 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 00:19:27.110 "is_configured": true, 00:19:27.110 "data_offset": 0, 00:19:27.110 
"data_size": 65536 00:19:27.110 }, 00:19:27.110 { 00:19:27.110 "name": "BaseBdev3", 00:19:27.110 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:27.110 "is_configured": true, 00:19:27.110 "data_offset": 0, 00:19:27.110 "data_size": 65536 00:19:27.110 }, 00:19:27.110 { 00:19:27.110 "name": "BaseBdev4", 00:19:27.110 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:27.110 "is_configured": true, 00:19:27.110 "data_offset": 0, 00:19:27.110 "data_size": 65536 00:19:27.110 } 00:19:27.110 ] 00:19:27.110 }' 00:19:27.110 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.110 23:41:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:27.679 "name": "raid_bdev1", 00:19:27.679 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:27.679 "strip_size_kb": 0, 00:19:27.679 "state": "online", 00:19:27.679 "raid_level": "raid1", 00:19:27.679 "superblock": false, 00:19:27.679 "num_base_bdevs": 4, 00:19:27.679 "num_base_bdevs_discovered": 3, 00:19:27.679 
"num_base_bdevs_operational": 3, 00:19:27.679 "base_bdevs_list": [ 00:19:27.679 { 00:19:27.679 "name": null, 00:19:27.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.679 "is_configured": false, 00:19:27.679 "data_offset": 0, 00:19:27.679 "data_size": 65536 00:19:27.679 }, 00:19:27.679 { 00:19:27.679 "name": "BaseBdev2", 00:19:27.679 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 00:19:27.679 "is_configured": true, 00:19:27.679 "data_offset": 0, 00:19:27.679 "data_size": 65536 00:19:27.679 }, 00:19:27.679 { 00:19:27.679 "name": "BaseBdev3", 00:19:27.679 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:27.679 "is_configured": true, 00:19:27.679 "data_offset": 0, 00:19:27.679 "data_size": 65536 00:19:27.679 }, 00:19:27.679 { 00:19:27.679 "name": "BaseBdev4", 00:19:27.679 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:27.679 "is_configured": true, 00:19:27.679 "data_offset": 0, 00:19:27.679 "data_size": 65536 00:19:27.679 } 00:19:27.679 ] 00:19:27.679 }' 00:19:27.679 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:27.938 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:27.938 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:27.938 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:27.938 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:27.938 [2024-07-24 23:41:12.898734] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:27.938 [2024-07-24 23:41:12.902255] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fbb30 00:19:27.938 [2024-07-24 23:41:12.903330] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid 
bdev raid_bdev1 00:19:27.938 23:41:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:29.314 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:29.314 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:29.314 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:29.314 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:29.315 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:29.315 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.315 23:41:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:29.315 "name": "raid_bdev1", 00:19:29.315 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:29.315 "strip_size_kb": 0, 00:19:29.315 "state": "online", 00:19:29.315 "raid_level": "raid1", 00:19:29.315 "superblock": false, 00:19:29.315 "num_base_bdevs": 4, 00:19:29.315 "num_base_bdevs_discovered": 4, 00:19:29.315 "num_base_bdevs_operational": 4, 00:19:29.315 "process": { 00:19:29.315 "type": "rebuild", 00:19:29.315 "target": "spare", 00:19:29.315 "progress": { 00:19:29.315 "blocks": 22528, 00:19:29.315 "percent": 34 00:19:29.315 } 00:19:29.315 }, 00:19:29.315 "base_bdevs_list": [ 00:19:29.315 { 00:19:29.315 "name": "spare", 00:19:29.315 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:29.315 "is_configured": true, 00:19:29.315 "data_offset": 0, 00:19:29.315 "data_size": 65536 00:19:29.315 }, 00:19:29.315 { 00:19:29.315 "name": "BaseBdev2", 00:19:29.315 "uuid": "9430907b-b1ad-5cf0-9f60-4a1f051353c3", 
00:19:29.315 "is_configured": true, 00:19:29.315 "data_offset": 0, 00:19:29.315 "data_size": 65536 00:19:29.315 }, 00:19:29.315 { 00:19:29.315 "name": "BaseBdev3", 00:19:29.315 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:29.315 "is_configured": true, 00:19:29.315 "data_offset": 0, 00:19:29.315 "data_size": 65536 00:19:29.315 }, 00:19:29.315 { 00:19:29.315 "name": "BaseBdev4", 00:19:29.315 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:29.315 "is_configured": true, 00:19:29.315 "data_offset": 0, 00:19:29.315 "data_size": 65536 00:19:29.315 } 00:19:29.315 ] 00:19:29.315 }' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:19:29.315 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:29.574 [2024-07-24 23:41:14.328013] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:29.574 [2024-07-24 23:41:14.413920] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17fbb30 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 
00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.574 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:29.832 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:29.832 "name": "raid_bdev1", 00:19:29.832 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:29.832 "strip_size_kb": 0, 00:19:29.832 "state": "online", 00:19:29.832 "raid_level": "raid1", 00:19:29.832 "superblock": false, 00:19:29.832 "num_base_bdevs": 4, 00:19:29.832 "num_base_bdevs_discovered": 3, 00:19:29.832 "num_base_bdevs_operational": 3, 00:19:29.832 "process": { 00:19:29.832 "type": "rebuild", 00:19:29.832 "target": "spare", 00:19:29.832 "progress": { 00:19:29.832 "blocks": 32768, 00:19:29.832 "percent": 50 00:19:29.832 } 00:19:29.832 }, 00:19:29.832 "base_bdevs_list": [ 00:19:29.832 { 00:19:29.832 "name": "spare", 00:19:29.833 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:29.833 "is_configured": true, 00:19:29.833 "data_offset": 0, 00:19:29.833 "data_size": 65536 00:19:29.833 }, 00:19:29.833 { 00:19:29.833 "name": null, 00:19:29.833 "uuid": "00000000-0000-0000-0000-000000000000", 
00:19:29.833 "is_configured": false, 00:19:29.833 "data_offset": 0, 00:19:29.833 "data_size": 65536 00:19:29.833 }, 00:19:29.833 { 00:19:29.833 "name": "BaseBdev3", 00:19:29.833 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:29.833 "is_configured": true, 00:19:29.833 "data_offset": 0, 00:19:29.833 "data_size": 65536 00:19:29.833 }, 00:19:29.833 { 00:19:29.833 "name": "BaseBdev4", 00:19:29.833 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:29.833 "is_configured": true, 00:19:29.833 "data_offset": 0, 00:19:29.833 "data_size": 65536 00:19:29.833 } 00:19:29.833 ] 00:19:29.833 }' 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=667 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:29.833 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.833 
23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:30.092 "name": "raid_bdev1", 00:19:30.092 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:30.092 "strip_size_kb": 0, 00:19:30.092 "state": "online", 00:19:30.092 "raid_level": "raid1", 00:19:30.092 "superblock": false, 00:19:30.092 "num_base_bdevs": 4, 00:19:30.092 "num_base_bdevs_discovered": 3, 00:19:30.092 "num_base_bdevs_operational": 3, 00:19:30.092 "process": { 00:19:30.092 "type": "rebuild", 00:19:30.092 "target": "spare", 00:19:30.092 "progress": { 00:19:30.092 "blocks": 38912, 00:19:30.092 "percent": 59 00:19:30.092 } 00:19:30.092 }, 00:19:30.092 "base_bdevs_list": [ 00:19:30.092 { 00:19:30.092 "name": "spare", 00:19:30.092 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:30.092 "is_configured": true, 00:19:30.092 "data_offset": 0, 00:19:30.092 "data_size": 65536 00:19:30.092 }, 00:19:30.092 { 00:19:30.092 "name": null, 00:19:30.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.092 "is_configured": false, 00:19:30.092 "data_offset": 0, 00:19:30.092 "data_size": 65536 00:19:30.092 }, 00:19:30.092 { 00:19:30.092 "name": "BaseBdev3", 00:19:30.092 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:30.092 "is_configured": true, 00:19:30.092 "data_offset": 0, 00:19:30.092 "data_size": 65536 00:19:30.092 }, 00:19:30.092 { 00:19:30.092 "name": "BaseBdev4", 00:19:30.092 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:30.092 "is_configured": true, 00:19:30.092 "data_offset": 0, 00:19:30.092 "data_size": 65536 00:19:30.092 } 00:19:30.092 ] 00:19:30.092 }' 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:30.092 23:41:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:31.027 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:31.027 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:31.027 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:31.027 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:31.027 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:31.028 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:31.028 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.028 23:41:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.286 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:31.286 "name": "raid_bdev1", 00:19:31.286 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:31.286 "strip_size_kb": 0, 00:19:31.286 "state": "online", 00:19:31.286 "raid_level": "raid1", 00:19:31.286 "superblock": false, 00:19:31.286 "num_base_bdevs": 4, 00:19:31.286 "num_base_bdevs_discovered": 3, 00:19:31.286 "num_base_bdevs_operational": 3, 00:19:31.286 "process": { 00:19:31.286 "type": "rebuild", 00:19:31.286 "target": "spare", 00:19:31.286 "progress": { 00:19:31.286 "blocks": 63488, 00:19:31.286 "percent": 96 00:19:31.286 } 00:19:31.286 }, 00:19:31.286 "base_bdevs_list": [ 00:19:31.286 { 00:19:31.286 "name": "spare", 00:19:31.286 "uuid": 
"3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:31.286 "is_configured": true, 00:19:31.286 "data_offset": 0, 00:19:31.286 "data_size": 65536 00:19:31.286 }, 00:19:31.286 { 00:19:31.286 "name": null, 00:19:31.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.286 "is_configured": false, 00:19:31.286 "data_offset": 0, 00:19:31.286 "data_size": 65536 00:19:31.286 }, 00:19:31.286 { 00:19:31.286 "name": "BaseBdev3", 00:19:31.286 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:31.286 "is_configured": true, 00:19:31.286 "data_offset": 0, 00:19:31.286 "data_size": 65536 00:19:31.286 }, 00:19:31.286 { 00:19:31.286 "name": "BaseBdev4", 00:19:31.287 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:31.287 "is_configured": true, 00:19:31.287 "data_offset": 0, 00:19:31.287 "data_size": 65536 00:19:31.287 } 00:19:31.287 ] 00:19:31.287 }' 00:19:31.287 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:31.287 [2024-07-24 23:41:16.125824] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:31.287 [2024-07-24 23:41:16.125866] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:31.287 [2024-07-24 23:41:16.125891] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:31.287 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:31.287 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:31.287 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:31.287 23:41:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild 
spare 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.222 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:32.482 "name": "raid_bdev1", 00:19:32.482 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:32.482 "strip_size_kb": 0, 00:19:32.482 "state": "online", 00:19:32.482 "raid_level": "raid1", 00:19:32.482 "superblock": false, 00:19:32.482 "num_base_bdevs": 4, 00:19:32.482 "num_base_bdevs_discovered": 3, 00:19:32.482 "num_base_bdevs_operational": 3, 00:19:32.482 "base_bdevs_list": [ 00:19:32.482 { 00:19:32.482 "name": "spare", 00:19:32.482 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:32.482 "is_configured": true, 00:19:32.482 "data_offset": 0, 00:19:32.482 "data_size": 65536 00:19:32.482 }, 00:19:32.482 { 00:19:32.482 "name": null, 00:19:32.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.482 "is_configured": false, 00:19:32.482 "data_offset": 0, 00:19:32.482 "data_size": 65536 00:19:32.482 }, 00:19:32.482 { 00:19:32.482 "name": "BaseBdev3", 00:19:32.482 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:32.482 "is_configured": true, 00:19:32.482 "data_offset": 0, 00:19:32.482 "data_size": 65536 00:19:32.482 }, 00:19:32.482 { 00:19:32.482 "name": "BaseBdev4", 00:19:32.482 "uuid": 
"e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:32.482 "is_configured": true, 00:19:32.482 "data_offset": 0, 00:19:32.482 "data_size": 65536 00:19:32.482 } 00:19:32.482 ] 00:19:32.482 }' 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.482 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.741 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:32.741 "name": "raid_bdev1", 00:19:32.741 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:32.741 "strip_size_kb": 0, 00:19:32.741 "state": "online", 00:19:32.741 "raid_level": "raid1", 00:19:32.741 "superblock": false, 00:19:32.741 "num_base_bdevs": 4, 00:19:32.741 "num_base_bdevs_discovered": 3, 00:19:32.741 
"num_base_bdevs_operational": 3, 00:19:32.741 "base_bdevs_list": [ 00:19:32.741 { 00:19:32.741 "name": "spare", 00:19:32.741 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:32.741 "is_configured": true, 00:19:32.741 "data_offset": 0, 00:19:32.741 "data_size": 65536 00:19:32.741 }, 00:19:32.741 { 00:19:32.741 "name": null, 00:19:32.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.741 "is_configured": false, 00:19:32.741 "data_offset": 0, 00:19:32.741 "data_size": 65536 00:19:32.741 }, 00:19:32.741 { 00:19:32.741 "name": "BaseBdev3", 00:19:32.741 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:32.741 "is_configured": true, 00:19:32.741 "data_offset": 0, 00:19:32.741 "data_size": 65536 00:19:32.741 }, 00:19:32.742 { 00:19:32.742 "name": "BaseBdev4", 00:19:32.742 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:32.742 "is_configured": true, 00:19:32.742 "data_offset": 0, 00:19:32.742 "data_size": 65536 00:19:32.742 } 00:19:32.742 ] 00:19:32.742 }' 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:32.742 23:41:17 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.742 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.001 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.001 "name": "raid_bdev1", 00:19:33.001 "uuid": "294a2b37-014c-419e-970f-750fbc612095", 00:19:33.001 "strip_size_kb": 0, 00:19:33.001 "state": "online", 00:19:33.001 "raid_level": "raid1", 00:19:33.001 "superblock": false, 00:19:33.001 "num_base_bdevs": 4, 00:19:33.001 "num_base_bdevs_discovered": 3, 00:19:33.001 "num_base_bdevs_operational": 3, 00:19:33.001 "base_bdevs_list": [ 00:19:33.001 { 00:19:33.001 "name": "spare", 00:19:33.001 "uuid": "3cd73924-43e3-5c78-bbf5-2b58a49cf660", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 }, 00:19:33.001 { 00:19:33.001 "name": null, 00:19:33.001 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.001 "is_configured": false, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 }, 00:19:33.001 { 00:19:33.001 "name": "BaseBdev3", 00:19:33.001 "uuid": "441d37a3-77cb-5ed6-beef-cafdc9527948", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 }, 00:19:33.001 { 00:19:33.001 
"name": "BaseBdev4", 00:19:33.001 "uuid": "e4265fd5-87df-52e8-9e11-a9191139fdaf", 00:19:33.001 "is_configured": true, 00:19:33.001 "data_offset": 0, 00:19:33.001 "data_size": 65536 00:19:33.001 } 00:19:33.001 ] 00:19:33.001 }' 00:19:33.001 23:41:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.001 23:41:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.568 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:33.568 [2024-07-24 23:41:18.447538] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:33.568 [2024-07-24 23:41:18.447559] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:33.568 [2024-07-24 23:41:18.447602] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:33.568 [2024-07-24 23:41:18.447653] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:33.568 [2024-07-24 23:41:18.447660] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f5d20 name raid_bdev1, state offline 00:19:33.568 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.568 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:33.828 /dev/nbd0 00:19:33.828 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:19:34.183 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:34.184 23:41:18 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:34.184 1+0 records in 00:19:34.184 1+0 records out 00:19:34.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 9.9253e-05 s, 41.3 MB/s 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:34.184 23:41:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:34.184 /dev/nbd1 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:34.184 1+0 records in 00:19:34.184 1+0 records out 00:19:34.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000150565 s, 27.2 MB/s 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:34.184 
23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:34.184 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:34.444 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:34.703 23:41:19 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 361362 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 361362 ']' 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 361362 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 361362 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 361362' 00:19:34.703 killing process with pid 361362 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 361362 00:19:34.703 Received shutdown signal, test time was about 60.000000 seconds 00:19:34.703 00:19:34.703 Latency(us) 00:19:34.703 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:34.703 
=================================================================================================================== 00:19:34.703 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:19:34.703 [2024-07-24 23:41:19.550220] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:34.703 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 361362 00:19:34.703 [2024-07-24 23:41:19.589722] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:19:34.962 00:19:34.962 real 0m18.924s 00:19:34.962 user 0m26.003s 00:19:34.962 sys 0m3.002s 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.962 ************************************ 00:19:34.962 END TEST raid_rebuild_test 00:19:34.962 ************************************ 00:19:34.962 23:41:19 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:19:34.962 23:41:19 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:19:34.962 23:41:19 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:34.962 23:41:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:34.962 ************************************ 00:19:34.962 START TEST raid_rebuild_test_sb 00:19:34.962 ************************************ 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true false true 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local 
superblock=true 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=364776 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:34.962 23:41:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 364776 /var/tmp/spdk-raid.sock 00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 364776 ']' 00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:34.963 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:34.963 23:41:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:34.963 [2024-07-24 23:41:19.872653] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:19:34.963 [2024-07-24 23:41:19.872689] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid364776 ] 00:19:34.963 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:34.963 Zero copy mechanism will not be used. 00:19:34.963 [2024-07-24 23:41:19.936686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:35.221 [2024-07-24 23:41:20.017536] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:35.221 [2024-07-24 23:41:20.075376] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:35.221 [2024-07-24 23:41:20.075405] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:35.789 23:41:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:35.789 23:41:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:19:35.789 23:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:35.789 23:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:36.048 BaseBdev1_malloc 00:19:36.048 23:41:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:36.048 
[2024-07-24 23:41:20.991108] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:36.048 [2024-07-24 23:41:20.991142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.048 [2024-07-24 23:41:20.991155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x229f1c0 00:19:36.048 [2024-07-24 23:41:20.991177] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.048 [2024-07-24 23:41:20.992305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.048 [2024-07-24 23:41:20.992325] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:36.048 BaseBdev1 00:19:36.048 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:36.048 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:36.306 BaseBdev2_malloc 00:19:36.306 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:36.565 [2024-07-24 23:41:21.311491] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:36.565 [2024-07-24 23:41:21.311524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.565 [2024-07-24 23:41:21.311539] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x229fce0 00:19:36.565 [2024-07-24 23:41:21.311545] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.565 [2024-07-24 23:41:21.312544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.565 [2024-07-24 23:41:21.312564] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:36.565 BaseBdev2 00:19:36.565 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:36.565 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:36.565 BaseBdev3_malloc 00:19:36.565 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:36.824 [2024-07-24 23:41:21.631802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:36.824 [2024-07-24 23:41:21.631830] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.824 [2024-07-24 23:41:21.631840] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244cd70 00:19:36.824 [2024-07-24 23:41:21.631846] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.824 [2024-07-24 23:41:21.632872] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.824 [2024-07-24 23:41:21.632892] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:36.824 BaseBdev3 00:19:36.824 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:19:36.824 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:36.824 BaseBdev4_malloc 00:19:36.824 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev4_malloc -p BaseBdev4 00:19:37.083 [2024-07-24 23:41:21.948311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:37.083 [2024-07-24 23:41:21.948342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:37.083 [2024-07-24 23:41:21.948355] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244bf50 00:19:37.083 [2024-07-24 23:41:21.948377] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:37.083 [2024-07-24 23:41:21.949403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:37.083 [2024-07-24 23:41:21.949424] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:37.083 BaseBdev4 00:19:37.083 23:41:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:37.342 spare_malloc 00:19:37.342 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:37.342 spare_delay 00:19:37.342 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:37.601 [2024-07-24 23:41:22.445098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:37.601 [2024-07-24 23:41:22.445129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:37.601 [2024-07-24 23:41:22.445140] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2450a30 00:19:37.601 [2024-07-24 23:41:22.445146] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:19:37.601 [2024-07-24 23:41:22.446215] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:37.601 [2024-07-24 23:41:22.446234] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:37.601 spare 00:19:37.601 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:37.902 [2024-07-24 23:41:22.609572] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:37.903 [2024-07-24 23:41:22.610450] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:37.903 [2024-07-24 23:41:22.610497] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:37.903 [2024-07-24 23:41:22.610527] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:37.903 [2024-07-24 23:41:22.610659] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23cfd20 00:19:37.903 [2024-07-24 23:41:22.610666] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:37.903 [2024-07-24 23:41:22.610811] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x244a290 00:19:37.903 [2024-07-24 23:41:22.610912] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23cfd20 00:19:37.903 [2024-07-24 23:41:22.610918] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23cfd20 00:19:37.903 [2024-07-24 23:41:22.610978] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.903 "name": "raid_bdev1", 00:19:37.903 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:37.903 "strip_size_kb": 0, 00:19:37.903 "state": "online", 00:19:37.903 "raid_level": "raid1", 00:19:37.903 "superblock": true, 00:19:37.903 "num_base_bdevs": 4, 00:19:37.903 "num_base_bdevs_discovered": 4, 00:19:37.903 "num_base_bdevs_operational": 4, 00:19:37.903 "base_bdevs_list": [ 00:19:37.903 { 00:19:37.903 "name": "BaseBdev1", 00:19:37.903 "uuid": "df03eb38-af80-5136-9cff-7973887f3bd0", 00:19:37.903 "is_configured": true, 00:19:37.903 "data_offset": 2048, 00:19:37.903 "data_size": 63488 00:19:37.903 }, 00:19:37.903 { 00:19:37.903 "name": "BaseBdev2", 00:19:37.903 
"uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:37.903 "is_configured": true, 00:19:37.903 "data_offset": 2048, 00:19:37.903 "data_size": 63488 00:19:37.903 }, 00:19:37.903 { 00:19:37.903 "name": "BaseBdev3", 00:19:37.903 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:37.903 "is_configured": true, 00:19:37.903 "data_offset": 2048, 00:19:37.903 "data_size": 63488 00:19:37.903 }, 00:19:37.903 { 00:19:37.903 "name": "BaseBdev4", 00:19:37.903 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:37.903 "is_configured": true, 00:19:37.903 "data_offset": 2048, 00:19:37.903 "data_size": 63488 00:19:37.903 } 00:19:37.903 ] 00:19:37.903 }' 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.903 23:41:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:38.537 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:38.537 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:19:38.537 [2024-07-24 23:41:23.435866] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:38.537 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:19:38.537 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.537 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true 
= true ']' 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:38.796 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:38.796 [2024-07-24 23:41:23.776576] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x244a290 00:19:38.796 /dev/nbd0 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:39.055 1+0 records in 00:19:39.055 1+0 records out 00:19:39.055 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000210136 s, 19.5 MB/s 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:19:39.055 23:41:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 
00:19:44.324 63488+0 records in 00:19:44.324 63488+0 records out 00:19:44.324 32505856 bytes (33 MB, 31 MiB) copied, 4.50492 s, 7.2 MB/s 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:44.324 [2024-07-24 23:41:28.510066] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:44.324 [2024-07-24 23:41:28.661145] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.324 "name": "raid_bdev1", 00:19:44.324 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:44.324 "strip_size_kb": 0, 00:19:44.324 "state": "online", 00:19:44.324 "raid_level": "raid1", 00:19:44.324 "superblock": true, 
00:19:44.324 "num_base_bdevs": 4, 00:19:44.324 "num_base_bdevs_discovered": 3, 00:19:44.324 "num_base_bdevs_operational": 3, 00:19:44.324 "base_bdevs_list": [ 00:19:44.324 { 00:19:44.324 "name": null, 00:19:44.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.324 "is_configured": false, 00:19:44.324 "data_offset": 2048, 00:19:44.324 "data_size": 63488 00:19:44.324 }, 00:19:44.324 { 00:19:44.324 "name": "BaseBdev2", 00:19:44.324 "uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:44.324 "is_configured": true, 00:19:44.324 "data_offset": 2048, 00:19:44.324 "data_size": 63488 00:19:44.324 }, 00:19:44.324 { 00:19:44.324 "name": "BaseBdev3", 00:19:44.324 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:44.324 "is_configured": true, 00:19:44.324 "data_offset": 2048, 00:19:44.324 "data_size": 63488 00:19:44.324 }, 00:19:44.324 { 00:19:44.324 "name": "BaseBdev4", 00:19:44.324 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:44.324 "is_configured": true, 00:19:44.324 "data_offset": 2048, 00:19:44.324 "data_size": 63488 00:19:44.324 } 00:19:44.324 ] 00:19:44.324 }' 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.324 23:41:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:44.582 23:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:44.582 [2024-07-24 23:41:29.491270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:44.582 [2024-07-24 23:41:29.494810] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x244a290 00:19:44.582 [2024-07-24 23:41:29.496239] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:44.582 23:41:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:19:45.518 23:41:30 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:45.518 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:45.518 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:45.518 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:45.518 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:45.776 "name": "raid_bdev1", 00:19:45.776 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:45.776 "strip_size_kb": 0, 00:19:45.776 "state": "online", 00:19:45.776 "raid_level": "raid1", 00:19:45.776 "superblock": true, 00:19:45.776 "num_base_bdevs": 4, 00:19:45.776 "num_base_bdevs_discovered": 4, 00:19:45.776 "num_base_bdevs_operational": 4, 00:19:45.776 "process": { 00:19:45.776 "type": "rebuild", 00:19:45.776 "target": "spare", 00:19:45.776 "progress": { 00:19:45.776 "blocks": 22528, 00:19:45.776 "percent": 35 00:19:45.776 } 00:19:45.776 }, 00:19:45.776 "base_bdevs_list": [ 00:19:45.776 { 00:19:45.776 "name": "spare", 00:19:45.776 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:45.776 "is_configured": true, 00:19:45.776 "data_offset": 2048, 00:19:45.776 "data_size": 63488 00:19:45.776 }, 00:19:45.776 { 00:19:45.776 "name": "BaseBdev2", 00:19:45.776 "uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:45.776 "is_configured": true, 00:19:45.776 "data_offset": 2048, 00:19:45.776 "data_size": 63488 
00:19:45.776 }, 00:19:45.776 { 00:19:45.776 "name": "BaseBdev3", 00:19:45.776 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:45.776 "is_configured": true, 00:19:45.776 "data_offset": 2048, 00:19:45.776 "data_size": 63488 00:19:45.776 }, 00:19:45.776 { 00:19:45.776 "name": "BaseBdev4", 00:19:45.776 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:45.776 "is_configured": true, 00:19:45.776 "data_offset": 2048, 00:19:45.776 "data_size": 63488 00:19:45.776 } 00:19:45.776 ] 00:19:45.776 }' 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:45.776 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:46.035 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:46.035 23:41:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:46.035 [2024-07-24 23:41:30.928529] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:46.035 [2024-07-24 23:41:31.006784] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:46.035 [2024-07-24 23:41:31.006820] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:46.035 [2024-07-24 23:41:31.006831] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:46.035 [2024-07-24 23:41:31.006835] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.035 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.294 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.294 "name": "raid_bdev1", 00:19:46.294 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:46.294 "strip_size_kb": 0, 00:19:46.294 "state": "online", 00:19:46.294 "raid_level": "raid1", 00:19:46.294 "superblock": true, 00:19:46.294 "num_base_bdevs": 4, 00:19:46.294 "num_base_bdevs_discovered": 3, 00:19:46.294 "num_base_bdevs_operational": 3, 00:19:46.294 "base_bdevs_list": [ 00:19:46.294 { 00:19:46.294 "name": null, 00:19:46.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.294 "is_configured": false, 00:19:46.294 "data_offset": 2048, 00:19:46.294 "data_size": 63488 00:19:46.294 }, 00:19:46.294 { 00:19:46.294 "name": "BaseBdev2", 
00:19:46.294 "uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:46.294 "is_configured": true, 00:19:46.294 "data_offset": 2048, 00:19:46.294 "data_size": 63488 00:19:46.294 }, 00:19:46.294 { 00:19:46.294 "name": "BaseBdev3", 00:19:46.294 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:46.294 "is_configured": true, 00:19:46.294 "data_offset": 2048, 00:19:46.294 "data_size": 63488 00:19:46.294 }, 00:19:46.294 { 00:19:46.294 "name": "BaseBdev4", 00:19:46.294 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:46.294 "is_configured": true, 00:19:46.294 "data_offset": 2048, 00:19:46.294 "data_size": 63488 00:19:46.294 } 00:19:46.294 ] 00:19:46.294 }' 00:19:46.294 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.294 23:41:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:46.861 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:46.862 "name": "raid_bdev1", 00:19:46.862 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:46.862 "strip_size_kb": 0, 00:19:46.862 "state": 
"online", 00:19:46.862 "raid_level": "raid1", 00:19:46.862 "superblock": true, 00:19:46.862 "num_base_bdevs": 4, 00:19:46.862 "num_base_bdevs_discovered": 3, 00:19:46.862 "num_base_bdevs_operational": 3, 00:19:46.862 "base_bdevs_list": [ 00:19:46.862 { 00:19:46.862 "name": null, 00:19:46.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.862 "is_configured": false, 00:19:46.862 "data_offset": 2048, 00:19:46.862 "data_size": 63488 00:19:46.862 }, 00:19:46.862 { 00:19:46.862 "name": "BaseBdev2", 00:19:46.862 "uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:46.862 "is_configured": true, 00:19:46.862 "data_offset": 2048, 00:19:46.862 "data_size": 63488 00:19:46.862 }, 00:19:46.862 { 00:19:46.862 "name": "BaseBdev3", 00:19:46.862 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:46.862 "is_configured": true, 00:19:46.862 "data_offset": 2048, 00:19:46.862 "data_size": 63488 00:19:46.862 }, 00:19:46.862 { 00:19:46.862 "name": "BaseBdev4", 00:19:46.862 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:46.862 "is_configured": true, 00:19:46.862 "data_offset": 2048, 00:19:46.862 "data_size": 63488 00:19:46.862 } 00:19:46.862 ] 00:19:46.862 }' 00:19:46.862 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:47.193 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:47.193 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:47.193 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:47.193 23:41:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:47.193 [2024-07-24 23:41:32.069197] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:47.193 [2024-07-24 23:41:32.072787] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x229e8f0 00:19:47.193 [2024-07-24 23:41:32.073803] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:47.193 23:41:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.128 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:48.387 "name": "raid_bdev1", 00:19:48.387 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:48.387 "strip_size_kb": 0, 00:19:48.387 "state": "online", 00:19:48.387 "raid_level": "raid1", 00:19:48.387 "superblock": true, 00:19:48.387 "num_base_bdevs": 4, 00:19:48.387 "num_base_bdevs_discovered": 4, 00:19:48.387 "num_base_bdevs_operational": 4, 00:19:48.387 "process": { 00:19:48.387 "type": "rebuild", 00:19:48.387 "target": "spare", 00:19:48.387 "progress": { 00:19:48.387 "blocks": 22528, 00:19:48.387 "percent": 35 00:19:48.387 } 00:19:48.387 }, 00:19:48.387 "base_bdevs_list": [ 00:19:48.387 { 00:19:48.387 "name": "spare", 00:19:48.387 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 
00:19:48.387 "is_configured": true, 00:19:48.387 "data_offset": 2048, 00:19:48.387 "data_size": 63488 00:19:48.387 }, 00:19:48.387 { 00:19:48.387 "name": "BaseBdev2", 00:19:48.387 "uuid": "cdaaa756-604e-5fbd-b43f-2d305b363a41", 00:19:48.387 "is_configured": true, 00:19:48.387 "data_offset": 2048, 00:19:48.387 "data_size": 63488 00:19:48.387 }, 00:19:48.387 { 00:19:48.387 "name": "BaseBdev3", 00:19:48.387 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:48.387 "is_configured": true, 00:19:48.387 "data_offset": 2048, 00:19:48.387 "data_size": 63488 00:19:48.387 }, 00:19:48.387 { 00:19:48.387 "name": "BaseBdev4", 00:19:48.387 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:48.387 "is_configured": true, 00:19:48.387 "data_offset": 2048, 00:19:48.387 "data_size": 63488 00:19:48.387 } 00:19:48.387 ] 00:19:48.387 }' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:19:48.387 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:19:48.387 23:41:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:48.645 [2024-07-24 23:41:33.494034] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:48.904 [2024-07-24 23:41:33.684615] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x229e8f0 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:48.904 "name": "raid_bdev1", 00:19:48.904 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:48.904 "strip_size_kb": 0, 00:19:48.904 "state": "online", 00:19:48.904 "raid_level": "raid1", 00:19:48.904 "superblock": true, 00:19:48.904 "num_base_bdevs": 4, 00:19:48.904 "num_base_bdevs_discovered": 3, 00:19:48.904 "num_base_bdevs_operational": 3, 00:19:48.904 "process": { 00:19:48.904 "type": 
"rebuild", 00:19:48.904 "target": "spare", 00:19:48.904 "progress": { 00:19:48.904 "blocks": 32768, 00:19:48.904 "percent": 51 00:19:48.904 } 00:19:48.904 }, 00:19:48.904 "base_bdevs_list": [ 00:19:48.904 { 00:19:48.904 "name": "spare", 00:19:48.904 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:48.904 "is_configured": true, 00:19:48.904 "data_offset": 2048, 00:19:48.904 "data_size": 63488 00:19:48.904 }, 00:19:48.904 { 00:19:48.904 "name": null, 00:19:48.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.904 "is_configured": false, 00:19:48.904 "data_offset": 2048, 00:19:48.904 "data_size": 63488 00:19:48.904 }, 00:19:48.904 { 00:19:48.904 "name": "BaseBdev3", 00:19:48.904 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:48.904 "is_configured": true, 00:19:48.904 "data_offset": 2048, 00:19:48.904 "data_size": 63488 00:19:48.904 }, 00:19:48.904 { 00:19:48.904 "name": "BaseBdev4", 00:19:48.904 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:48.904 "is_configured": true, 00:19:48.904 "data_offset": 2048, 00:19:48.904 "data_size": 63488 00:19:48.904 } 00:19:48.904 ] 00:19:48.904 }' 00:19:48.904 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=686 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 
-- # local raid_bdev_name=raid_bdev1 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:49.162 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:49.163 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.163 23:41:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.163 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:49.163 "name": "raid_bdev1", 00:19:49.163 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:49.163 "strip_size_kb": 0, 00:19:49.163 "state": "online", 00:19:49.163 "raid_level": "raid1", 00:19:49.163 "superblock": true, 00:19:49.163 "num_base_bdevs": 4, 00:19:49.163 "num_base_bdevs_discovered": 3, 00:19:49.163 "num_base_bdevs_operational": 3, 00:19:49.163 "process": { 00:19:49.163 "type": "rebuild", 00:19:49.163 "target": "spare", 00:19:49.163 "progress": { 00:19:49.163 "blocks": 38912, 00:19:49.163 "percent": 61 00:19:49.163 } 00:19:49.163 }, 00:19:49.163 "base_bdevs_list": [ 00:19:49.163 { 00:19:49.163 "name": "spare", 00:19:49.163 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:49.163 "is_configured": true, 00:19:49.163 "data_offset": 2048, 00:19:49.163 "data_size": 63488 00:19:49.163 }, 00:19:49.163 { 00:19:49.163 "name": null, 00:19:49.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.163 "is_configured": false, 00:19:49.163 "data_offset": 2048, 00:19:49.163 "data_size": 63488 00:19:49.163 }, 00:19:49.163 { 00:19:49.163 "name": "BaseBdev3", 00:19:49.163 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:49.163 "is_configured": true, 00:19:49.163 "data_offset": 2048, 
00:19:49.163 "data_size": 63488 00:19:49.163 }, 00:19:49.163 { 00:19:49.163 "name": "BaseBdev4", 00:19:49.163 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:49.163 "is_configured": true, 00:19:49.163 "data_offset": 2048, 00:19:49.163 "data_size": 63488 00:19:49.163 } 00:19:49.163 ] 00:19:49.163 }' 00:19:49.163 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:49.163 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:49.163 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:49.421 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:49.421 23:41:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.357 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.357 [2024-07-24 23:41:35.295846] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:50.357 
[2024-07-24 23:41:35.295887] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:50.357 [2024-07-24 23:41:35.295962] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:50.616 "name": "raid_bdev1", 00:19:50.616 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:50.616 "strip_size_kb": 0, 00:19:50.616 "state": "online", 00:19:50.616 "raid_level": "raid1", 00:19:50.616 "superblock": true, 00:19:50.616 "num_base_bdevs": 4, 00:19:50.616 "num_base_bdevs_discovered": 3, 00:19:50.616 "num_base_bdevs_operational": 3, 00:19:50.616 "base_bdevs_list": [ 00:19:50.616 { 00:19:50.616 "name": "spare", 00:19:50.616 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": null, 00:19:50.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.616 "is_configured": false, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": "BaseBdev3", 00:19:50.616 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": "BaseBdev4", 00:19:50.616 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 } 00:19:50.616 ] 00:19:50.616 }' 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:50.616 "name": "raid_bdev1", 00:19:50.616 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:50.616 "strip_size_kb": 0, 00:19:50.616 "state": "online", 00:19:50.616 "raid_level": "raid1", 00:19:50.616 "superblock": true, 00:19:50.616 "num_base_bdevs": 4, 00:19:50.616 "num_base_bdevs_discovered": 3, 00:19:50.616 "num_base_bdevs_operational": 3, 00:19:50.616 "base_bdevs_list": [ 00:19:50.616 { 00:19:50.616 "name": "spare", 00:19:50.616 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": null, 00:19:50.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.616 "is_configured": false, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 
63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": "BaseBdev3", 00:19:50.616 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 }, 00:19:50.616 { 00:19:50.616 "name": "BaseBdev4", 00:19:50.616 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:50.616 "is_configured": true, 00:19:50.616 "data_offset": 2048, 00:19:50.616 "data_size": 63488 00:19:50.616 } 00:19:50.616 ] 00:19:50.616 }' 00:19:50.616 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.876 23:41:35 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.876 "name": "raid_bdev1", 00:19:50.876 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:50.876 "strip_size_kb": 0, 00:19:50.876 "state": "online", 00:19:50.876 "raid_level": "raid1", 00:19:50.876 "superblock": true, 00:19:50.876 "num_base_bdevs": 4, 00:19:50.876 "num_base_bdevs_discovered": 3, 00:19:50.876 "num_base_bdevs_operational": 3, 00:19:50.876 "base_bdevs_list": [ 00:19:50.876 { 00:19:50.876 "name": "spare", 00:19:50.876 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:50.876 "is_configured": true, 00:19:50.876 "data_offset": 2048, 00:19:50.876 "data_size": 63488 00:19:50.876 }, 00:19:50.876 { 00:19:50.876 "name": null, 00:19:50.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.876 "is_configured": false, 00:19:50.876 "data_offset": 2048, 00:19:50.876 "data_size": 63488 00:19:50.876 }, 00:19:50.876 { 00:19:50.876 "name": "BaseBdev3", 00:19:50.876 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:50.876 "is_configured": true, 00:19:50.876 "data_offset": 2048, 00:19:50.876 "data_size": 63488 00:19:50.876 }, 00:19:50.876 { 00:19:50.876 "name": "BaseBdev4", 00:19:50.876 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:50.876 "is_configured": true, 00:19:50.876 "data_offset": 2048, 00:19:50.876 "data_size": 63488 00:19:50.876 } 00:19:50.876 ] 00:19:50.876 }' 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.876 23:41:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:19:51.443 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:51.702 [2024-07-24 23:41:36.454560] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:51.702 [2024-07-24 23:41:36.454580] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:51.702 [2024-07-24 23:41:36.454622] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:51.702 [2024-07-24 23:41:36.454670] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:51.702 [2024-07-24 23:41:36.454676] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23cfd20 name raid_bdev1, state offline 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:51.702 23:41:36 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:51.702 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:19:51.961 /dev/nbd0 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:51.962 1+0 records in 00:19:51.962 
1+0 records out 00:19:51.962 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228096 s, 18.0 MB/s 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:51.962 23:41:36 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:19:52.221 /dev/nbd1 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 
00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:52.221 1+0 records in 00:19:52.221 1+0 records out 00:19:52.221 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220398 s, 18.6 MB/s 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:52.221 23:41:37 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:52.221 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:52.480 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:52.739 
23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:52.739 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:52.998 [2024-07-24 23:41:37.799285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:52.998 [2024-07-24 23:41:37.799313] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:52.998 [2024-07-24 23:41:37.799325] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d4c10 00:19:52.998 [2024-07-24 23:41:37.799331] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:52.998 [2024-07-24 23:41:37.800505] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:52.998 [2024-07-24 23:41:37.800524] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:52.998 [2024-07-24 23:41:37.800573] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:52.998 [2024-07-24 23:41:37.800592] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:52.998 [2024-07-24 23:41:37.800661] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:52.998 [2024-07-24 23:41:37.800709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:19:52.998 spare 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:52.998 [2024-07-24 23:41:37.901002] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d5630 00:19:52.998 [2024-07-24 23:41:37.901011] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:52.998 [2024-07-24 23:41:37.901131] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x243baa0 00:19:52.998 [2024-07-24 23:41:37.901223] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d5630 00:19:52.998 [2024-07-24 
23:41:37.901228] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d5630 00:19:52.998 [2024-07-24 23:41:37.901289] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.998 "name": "raid_bdev1", 00:19:52.998 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:52.998 "strip_size_kb": 0, 00:19:52.998 "state": "online", 00:19:52.998 "raid_level": "raid1", 00:19:52.998 "superblock": true, 00:19:52.998 "num_base_bdevs": 4, 00:19:52.998 "num_base_bdevs_discovered": 3, 00:19:52.998 "num_base_bdevs_operational": 3, 00:19:52.998 "base_bdevs_list": [ 00:19:52.998 { 00:19:52.998 "name": "spare", 00:19:52.998 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:52.998 "is_configured": true, 00:19:52.998 "data_offset": 2048, 00:19:52.998 "data_size": 63488 00:19:52.998 }, 00:19:52.998 { 00:19:52.998 "name": null, 00:19:52.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.998 "is_configured": false, 00:19:52.998 "data_offset": 2048, 00:19:52.998 "data_size": 63488 00:19:52.998 }, 00:19:52.998 { 00:19:52.998 "name": "BaseBdev3", 00:19:52.998 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:52.998 "is_configured": true, 00:19:52.998 "data_offset": 2048, 00:19:52.998 "data_size": 63488 00:19:52.998 }, 00:19:52.998 { 00:19:52.998 "name": "BaseBdev4", 00:19:52.998 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:52.998 "is_configured": true, 00:19:52.998 "data_offset": 2048, 00:19:52.998 "data_size": 63488 00:19:52.998 } 00:19:52.998 ] 00:19:52.998 }' 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.998 23:41:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 
00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.565 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:53.824 "name": "raid_bdev1", 00:19:53.824 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:53.824 "strip_size_kb": 0, 00:19:53.824 "state": "online", 00:19:53.824 "raid_level": "raid1", 00:19:53.824 "superblock": true, 00:19:53.824 "num_base_bdevs": 4, 00:19:53.824 "num_base_bdevs_discovered": 3, 00:19:53.824 "num_base_bdevs_operational": 3, 00:19:53.824 "base_bdevs_list": [ 00:19:53.824 { 00:19:53.824 "name": "spare", 00:19:53.824 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:53.824 "is_configured": true, 00:19:53.824 "data_offset": 2048, 00:19:53.824 "data_size": 63488 00:19:53.824 }, 00:19:53.824 { 00:19:53.824 "name": null, 00:19:53.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.824 "is_configured": false, 00:19:53.824 "data_offset": 2048, 00:19:53.824 "data_size": 63488 00:19:53.824 }, 00:19:53.824 { 00:19:53.824 "name": "BaseBdev3", 00:19:53.824 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:53.824 "is_configured": true, 00:19:53.824 "data_offset": 2048, 00:19:53.824 "data_size": 63488 00:19:53.824 }, 00:19:53.824 { 00:19:53.824 "name": "BaseBdev4", 00:19:53.824 "uuid": 
"915ca892-f98f-5872-af79-105e3effc765", 00:19:53.824 "is_configured": true, 00:19:53.824 "data_offset": 2048, 00:19:53.824 "data_size": 63488 00:19:53.824 } 00:19:53.824 ] 00:19:53.824 }' 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.824 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:54.084 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:19:54.084 23:41:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:54.084 [2024-07-24 23:41:39.054595] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.084 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:54.343 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.343 "name": "raid_bdev1", 00:19:54.343 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:54.343 "strip_size_kb": 0, 00:19:54.343 "state": "online", 00:19:54.343 "raid_level": "raid1", 00:19:54.343 "superblock": true, 00:19:54.343 "num_base_bdevs": 4, 00:19:54.343 "num_base_bdevs_discovered": 2, 00:19:54.343 "num_base_bdevs_operational": 2, 00:19:54.343 "base_bdevs_list": [ 00:19:54.343 { 00:19:54.343 "name": null, 00:19:54.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.343 "is_configured": false, 00:19:54.343 "data_offset": 2048, 00:19:54.343 "data_size": 63488 00:19:54.343 }, 00:19:54.343 { 00:19:54.343 "name": null, 00:19:54.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:54.343 "is_configured": false, 00:19:54.343 "data_offset": 2048, 00:19:54.343 "data_size": 63488 00:19:54.343 }, 00:19:54.343 { 00:19:54.343 "name": "BaseBdev3", 00:19:54.343 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:54.343 "is_configured": true, 00:19:54.343 "data_offset": 2048, 00:19:54.343 "data_size": 63488 00:19:54.343 }, 00:19:54.343 { 00:19:54.343 "name": 
"BaseBdev4", 00:19:54.343 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:54.343 "is_configured": true, 00:19:54.343 "data_offset": 2048, 00:19:54.343 "data_size": 63488 00:19:54.343 } 00:19:54.343 ] 00:19:54.343 }' 00:19:54.343 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.343 23:41:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.910 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:54.910 [2024-07-24 23:41:39.864722] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:54.910 [2024-07-24 23:41:39.864835] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:19:54.910 [2024-07-24 23:41:39.864844] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:19:54.910 [2024-07-24 23:41:39.864862] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:54.910 [2024-07-24 23:41:39.868295] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x244f890 00:19:54.910 [2024-07-24 23:41:39.869802] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:54.910 23:41:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.286 23:41:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:56.287 "name": "raid_bdev1", 00:19:56.287 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:56.287 "strip_size_kb": 0, 00:19:56.287 "state": "online", 00:19:56.287 "raid_level": "raid1", 00:19:56.287 "superblock": true, 00:19:56.287 "num_base_bdevs": 4, 00:19:56.287 "num_base_bdevs_discovered": 3, 00:19:56.287 "num_base_bdevs_operational": 3, 00:19:56.287 "process": { 00:19:56.287 "type": "rebuild", 00:19:56.287 "target": "spare", 00:19:56.287 "progress": { 00:19:56.287 "blocks": 22528, 00:19:56.287 "percent": 35 
00:19:56.287 } 00:19:56.287 }, 00:19:56.287 "base_bdevs_list": [ 00:19:56.287 { 00:19:56.287 "name": "spare", 00:19:56.287 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:56.287 "is_configured": true, 00:19:56.287 "data_offset": 2048, 00:19:56.287 "data_size": 63488 00:19:56.287 }, 00:19:56.287 { 00:19:56.287 "name": null, 00:19:56.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.287 "is_configured": false, 00:19:56.287 "data_offset": 2048, 00:19:56.287 "data_size": 63488 00:19:56.287 }, 00:19:56.287 { 00:19:56.287 "name": "BaseBdev3", 00:19:56.287 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:56.287 "is_configured": true, 00:19:56.287 "data_offset": 2048, 00:19:56.287 "data_size": 63488 00:19:56.287 }, 00:19:56.287 { 00:19:56.287 "name": "BaseBdev4", 00:19:56.287 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:56.287 "is_configured": true, 00:19:56.287 "data_offset": 2048, 00:19:56.287 "data_size": 63488 00:19:56.287 } 00:19:56.287 ] 00:19:56.287 }' 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:56.287 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:56.546 [2024-07-24 23:41:41.306288] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:56.546 [2024-07-24 23:41:41.380425] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:56.546 [2024-07-24 23:41:41.380453] bdev_raid.c: 343:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:19:56.546 [2024-07-24 23:41:41.380461] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:56.546 [2024-07-24 23:41:41.380465] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:56.546 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.805 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.805 "name": "raid_bdev1", 00:19:56.805 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:56.805 "strip_size_kb": 0, 00:19:56.805 "state": "online", 00:19:56.805 
"raid_level": "raid1", 00:19:56.805 "superblock": true, 00:19:56.805 "num_base_bdevs": 4, 00:19:56.805 "num_base_bdevs_discovered": 2, 00:19:56.805 "num_base_bdevs_operational": 2, 00:19:56.805 "base_bdevs_list": [ 00:19:56.805 { 00:19:56.805 "name": null, 00:19:56.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.805 "is_configured": false, 00:19:56.805 "data_offset": 2048, 00:19:56.806 "data_size": 63488 00:19:56.806 }, 00:19:56.806 { 00:19:56.806 "name": null, 00:19:56.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.806 "is_configured": false, 00:19:56.806 "data_offset": 2048, 00:19:56.806 "data_size": 63488 00:19:56.806 }, 00:19:56.806 { 00:19:56.806 "name": "BaseBdev3", 00:19:56.806 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:56.806 "is_configured": true, 00:19:56.806 "data_offset": 2048, 00:19:56.806 "data_size": 63488 00:19:56.806 }, 00:19:56.806 { 00:19:56.806 "name": "BaseBdev4", 00:19:56.806 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:56.806 "is_configured": true, 00:19:56.806 "data_offset": 2048, 00:19:56.806 "data_size": 63488 00:19:56.806 } 00:19:56.806 ] 00:19:56.806 }' 00:19:56.806 23:41:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.806 23:41:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:57.373 23:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:57.373 [2024-07-24 23:41:42.206093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:57.373 [2024-07-24 23:41:42.206130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.374 [2024-07-24 23:41:42.206143] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2449ca0 00:19:57.374 [2024-07-24 23:41:42.206149] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.374 [2024-07-24 23:41:42.206404] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.374 [2024-07-24 23:41:42.206414] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:57.374 [2024-07-24 23:41:42.206464] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:57.374 [2024-07-24 23:41:42.206476] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:19:57.374 [2024-07-24 23:41:42.206481] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:57.374 [2024-07-24 23:41:42.206491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:57.374 [2024-07-24 23:41:42.209884] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2449f30 00:19:57.374 [2024-07-24 23:41:42.210848] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:57.374 spare 00:19:57.374 23:41:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:19:58.310 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.569 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:58.569 "name": "raid_bdev1", 00:19:58.569 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:58.569 "strip_size_kb": 0, 00:19:58.569 "state": "online", 00:19:58.569 "raid_level": "raid1", 00:19:58.569 "superblock": true, 00:19:58.569 "num_base_bdevs": 4, 00:19:58.569 "num_base_bdevs_discovered": 3, 00:19:58.569 "num_base_bdevs_operational": 3, 00:19:58.569 "process": { 00:19:58.569 "type": "rebuild", 00:19:58.569 "target": "spare", 00:19:58.569 "progress": { 00:19:58.569 "blocks": 22528, 00:19:58.569 "percent": 35 00:19:58.569 } 00:19:58.569 }, 00:19:58.569 "base_bdevs_list": [ 00:19:58.569 { 00:19:58.569 "name": "spare", 00:19:58.569 "uuid": "544878d6-27dc-564a-92e4-79e1bfc7de27", 00:19:58.569 "is_configured": true, 00:19:58.569 "data_offset": 2048, 00:19:58.569 "data_size": 63488 00:19:58.569 }, 00:19:58.569 { 00:19:58.569 "name": null, 00:19:58.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.569 "is_configured": false, 00:19:58.569 "data_offset": 2048, 00:19:58.569 "data_size": 63488 00:19:58.569 }, 00:19:58.569 { 00:19:58.569 "name": "BaseBdev3", 00:19:58.569 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:58.569 "is_configured": true, 00:19:58.569 "data_offset": 2048, 00:19:58.569 "data_size": 63488 00:19:58.569 }, 00:19:58.569 { 00:19:58.569 "name": "BaseBdev4", 00:19:58.569 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:58.569 "is_configured": true, 00:19:58.569 "data_offset": 2048, 00:19:58.569 "data_size": 63488 00:19:58.569 } 00:19:58.569 ] 00:19:58.569 }' 00:19:58.569 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:58.569 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:58.569 
23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:58.569 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:19:58.569 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:58.827 [2024-07-24 23:41:43.647570] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:58.827 [2024-07-24 23:41:43.721377] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:58.827 [2024-07-24 23:41:43.721404] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.827 [2024-07-24 23:41:43.721413] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:58.827 [2024-07-24 23:41:43.721417] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.827 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.828 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.086 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.086 "name": "raid_bdev1", 00:19:59.086 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:59.086 "strip_size_kb": 0, 00:19:59.086 "state": "online", 00:19:59.086 "raid_level": "raid1", 00:19:59.086 "superblock": true, 00:19:59.086 "num_base_bdevs": 4, 00:19:59.086 "num_base_bdevs_discovered": 2, 00:19:59.086 "num_base_bdevs_operational": 2, 00:19:59.086 "base_bdevs_list": [ 00:19:59.086 { 00:19:59.086 "name": null, 00:19:59.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.086 "is_configured": false, 00:19:59.086 "data_offset": 2048, 00:19:59.086 "data_size": 63488 00:19:59.086 }, 00:19:59.086 { 00:19:59.086 "name": null, 00:19:59.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.086 "is_configured": false, 00:19:59.086 "data_offset": 2048, 00:19:59.086 "data_size": 63488 00:19:59.086 }, 00:19:59.086 { 00:19:59.086 "name": "BaseBdev3", 00:19:59.086 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:59.086 "is_configured": true, 00:19:59.086 "data_offset": 2048, 00:19:59.086 "data_size": 63488 00:19:59.086 }, 00:19:59.086 { 00:19:59.086 "name": "BaseBdev4", 00:19:59.086 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:59.086 "is_configured": true, 00:19:59.086 "data_offset": 2048, 00:19:59.086 "data_size": 63488 00:19:59.086 } 00:19:59.086 ] 00:19:59.086 }' 00:19:59.086 23:41:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:19:59.086 23:41:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:19:59.653 "name": "raid_bdev1", 00:19:59.653 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:19:59.653 "strip_size_kb": 0, 00:19:59.653 "state": "online", 00:19:59.653 "raid_level": "raid1", 00:19:59.653 "superblock": true, 00:19:59.653 "num_base_bdevs": 4, 00:19:59.653 "num_base_bdevs_discovered": 2, 00:19:59.653 "num_base_bdevs_operational": 2, 00:19:59.653 "base_bdevs_list": [ 00:19:59.653 { 00:19:59.653 "name": null, 00:19:59.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.653 "is_configured": false, 00:19:59.653 "data_offset": 2048, 00:19:59.653 "data_size": 63488 00:19:59.653 }, 00:19:59.653 { 00:19:59.653 "name": null, 00:19:59.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.653 "is_configured": false, 00:19:59.653 "data_offset": 2048, 00:19:59.653 "data_size": 63488 00:19:59.653 }, 00:19:59.653 { 00:19:59.653 "name": "BaseBdev3", 00:19:59.653 "uuid": 
"47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:19:59.653 "is_configured": true, 00:19:59.653 "data_offset": 2048, 00:19:59.653 "data_size": 63488 00:19:59.653 }, 00:19:59.653 { 00:19:59.653 "name": "BaseBdev4", 00:19:59.653 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:19:59.653 "is_configured": true, 00:19:59.653 "data_offset": 2048, 00:19:59.653 "data_size": 63488 00:19:59.653 } 00:19:59.653 ] 00:19:59.653 }' 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:59.653 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:59.912 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:00.170 [2024-07-24 23:41:44.960253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:00.170 [2024-07-24 23:41:44.960288] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:00.170 [2024-07-24 23:41:44.960300] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x229f3f0 00:20:00.170 [2024-07-24 23:41:44.960306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:00.170 [2024-07-24 23:41:44.960575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:00.170 [2024-07-24 23:41:44.960590] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: BaseBdev1 00:20:00.170 [2024-07-24 23:41:44.960634] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:00.171 [2024-07-24 23:41:44.960641] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:00.171 [2024-07-24 23:41:44.960646] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:00.171 BaseBdev1 00:20:00.171 23:41:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.185 23:41:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:20:01.185 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.185 "name": "raid_bdev1", 00:20:01.185 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:20:01.185 "strip_size_kb": 0, 00:20:01.185 "state": "online", 00:20:01.185 "raid_level": "raid1", 00:20:01.185 "superblock": true, 00:20:01.185 "num_base_bdevs": 4, 00:20:01.185 "num_base_bdevs_discovered": 2, 00:20:01.185 "num_base_bdevs_operational": 2, 00:20:01.185 "base_bdevs_list": [ 00:20:01.185 { 00:20:01.185 "name": null, 00:20:01.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:01.185 "is_configured": false, 00:20:01.185 "data_offset": 2048, 00:20:01.185 "data_size": 63488 00:20:01.185 }, 00:20:01.185 { 00:20:01.185 "name": null, 00:20:01.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:01.185 "is_configured": false, 00:20:01.185 "data_offset": 2048, 00:20:01.185 "data_size": 63488 00:20:01.185 }, 00:20:01.185 { 00:20:01.185 "name": "BaseBdev3", 00:20:01.185 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:20:01.185 "is_configured": true, 00:20:01.185 "data_offset": 2048, 00:20:01.185 "data_size": 63488 00:20:01.185 }, 00:20:01.185 { 00:20:01.185 "name": "BaseBdev4", 00:20:01.185 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:20:01.185 "is_configured": true, 00:20:01.185 "data_offset": 2048, 00:20:01.185 "data_size": 63488 00:20:01.185 } 00:20:01.185 ] 00:20:01.185 }' 00:20:01.185 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.185 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.752 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:02.012 "name": "raid_bdev1", 00:20:02.012 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:20:02.012 "strip_size_kb": 0, 00:20:02.012 "state": "online", 00:20:02.012 "raid_level": "raid1", 00:20:02.012 "superblock": true, 00:20:02.012 "num_base_bdevs": 4, 00:20:02.012 "num_base_bdevs_discovered": 2, 00:20:02.012 "num_base_bdevs_operational": 2, 00:20:02.012 "base_bdevs_list": [ 00:20:02.012 { 00:20:02.012 "name": null, 00:20:02.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.012 "is_configured": false, 00:20:02.012 "data_offset": 2048, 00:20:02.012 "data_size": 63488 00:20:02.012 }, 00:20:02.012 { 00:20:02.012 "name": null, 00:20:02.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.012 "is_configured": false, 00:20:02.012 "data_offset": 2048, 00:20:02.012 "data_size": 63488 00:20:02.012 }, 00:20:02.012 { 00:20:02.012 "name": "BaseBdev3", 00:20:02.012 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:20:02.012 "is_configured": true, 00:20:02.012 "data_offset": 2048, 00:20:02.012 "data_size": 63488 00:20:02.012 }, 00:20:02.012 { 00:20:02.012 "name": "BaseBdev4", 00:20:02.012 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:20:02.012 "is_configured": true, 00:20:02.012 "data_offset": 2048, 00:20:02.012 "data_size": 63488 00:20:02.012 } 00:20:02.012 ] 00:20:02.012 }' 00:20:02.012 23:41:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:02.012 23:41:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:02.271 [2024-07-24 23:41:47.041635] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:02.271 [2024-07-24 23:41:47.041739] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:02.271 [2024-07-24 23:41:47.041749] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:02.271 request: 00:20:02.271 { 00:20:02.271 "base_bdev": "BaseBdev1", 00:20:02.271 "raid_bdev": "raid_bdev1", 00:20:02.271 "method": "bdev_raid_add_base_bdev", 00:20:02.271 "req_id": 1 00:20:02.271 } 00:20:02.271 Got JSON-RPC error response 00:20:02.271 response: 00:20:02.271 { 00:20:02.271 "code": -22, 00:20:02.271 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:02.271 } 00:20:02.271 23:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:20:02.271 23:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:02.271 23:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:02.271 23:41:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:02.271 23:41:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.208 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.467 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.467 "name": "raid_bdev1", 00:20:03.467 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:20:03.467 "strip_size_kb": 0, 00:20:03.467 "state": "online", 00:20:03.467 "raid_level": "raid1", 00:20:03.467 "superblock": true, 00:20:03.467 "num_base_bdevs": 4, 00:20:03.467 "num_base_bdevs_discovered": 2, 00:20:03.467 "num_base_bdevs_operational": 2, 00:20:03.467 "base_bdevs_list": [ 00:20:03.467 { 00:20:03.467 "name": null, 00:20:03.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.467 "is_configured": false, 00:20:03.467 "data_offset": 2048, 00:20:03.467 "data_size": 63488 00:20:03.467 }, 00:20:03.467 { 00:20:03.467 "name": null, 00:20:03.467 "uuid": "00000000-0000-0000-0000-000000000000", 
00:20:03.467 "is_configured": false, 00:20:03.467 "data_offset": 2048, 00:20:03.467 "data_size": 63488 00:20:03.467 }, 00:20:03.467 { 00:20:03.467 "name": "BaseBdev3", 00:20:03.467 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:20:03.467 "is_configured": true, 00:20:03.467 "data_offset": 2048, 00:20:03.467 "data_size": 63488 00:20:03.467 }, 00:20:03.467 { 00:20:03.467 "name": "BaseBdev4", 00:20:03.467 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:20:03.467 "is_configured": true, 00:20:03.467 "data_offset": 2048, 00:20:03.467 "data_size": 63488 00:20:03.467 } 00:20:03.467 ] 00:20:03.467 }' 00:20:03.467 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.467 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.726 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:03.985 "name": "raid_bdev1", 00:20:03.985 "uuid": "497ce2d0-6132-4f1a-87ad-8a4f392325dd", 00:20:03.985 "strip_size_kb": 0, 00:20:03.985 "state": "online", 00:20:03.985 "raid_level": "raid1", 00:20:03.985 
"superblock": true, 00:20:03.985 "num_base_bdevs": 4, 00:20:03.985 "num_base_bdevs_discovered": 2, 00:20:03.985 "num_base_bdevs_operational": 2, 00:20:03.985 "base_bdevs_list": [ 00:20:03.985 { 00:20:03.985 "name": null, 00:20:03.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.985 "is_configured": false, 00:20:03.985 "data_offset": 2048, 00:20:03.985 "data_size": 63488 00:20:03.985 }, 00:20:03.985 { 00:20:03.985 "name": null, 00:20:03.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.985 "is_configured": false, 00:20:03.985 "data_offset": 2048, 00:20:03.985 "data_size": 63488 00:20:03.985 }, 00:20:03.985 { 00:20:03.985 "name": "BaseBdev3", 00:20:03.985 "uuid": "47821aa0-2490-58e2-bc09-95c6fd9ad0ae", 00:20:03.985 "is_configured": true, 00:20:03.985 "data_offset": 2048, 00:20:03.985 "data_size": 63488 00:20:03.985 }, 00:20:03.985 { 00:20:03.985 "name": "BaseBdev4", 00:20:03.985 "uuid": "915ca892-f98f-5872-af79-105e3effc765", 00:20:03.985 "is_configured": true, 00:20:03.985 "data_offset": 2048, 00:20:03.985 "data_size": 63488 00:20:03.985 } 00:20:03.985 ] 00:20:03.985 }' 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 364776 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 364776 ']' 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 364776 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:20:03.985 23:41:48 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:03.985 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 364776 00:20:04.244 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:04.244 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:04.244 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 364776' 00:20:04.244 killing process with pid 364776 00:20:04.244 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 364776 00:20:04.244 Received shutdown signal, test time was about 60.000000 seconds 00:20:04.244 00:20:04.244 Latency(us) 00:20:04.244 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:04.244 =================================================================================================================== 00:20:04.244 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:04.244 [2024-07-24 23:41:48.996716] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:04.244 [2024-07-24 23:41:48.996785] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:04.244 [2024-07-24 23:41:48.996828] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:04.244 [2024-07-24 23:41:48.996834] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d5630 name raid_bdev1, state offline 00:20:04.244 23:41:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 364776 00:20:04.244 [2024-07-24 23:41:49.036509] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:04.244 23:41:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:20:04.244 00:20:04.244 real 0m29.375s 00:20:04.244 
user 0m42.886s 00:20:04.244 sys 0m4.029s 00:20:04.244 23:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:04.244 23:41:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.244 ************************************ 00:20:04.244 END TEST raid_rebuild_test_sb 00:20:04.244 ************************************ 00:20:04.244 23:41:49 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:20:04.244 23:41:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:20:04.244 23:41:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:04.244 23:41:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:04.504 ************************************ 00:20:04.504 START TEST raid_rebuild_test_io 00:20:04.504 ************************************ 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false true true 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:04.504 23:41:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:04.504 23:41:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=370037 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 370037 /var/tmp/spdk-raid.sock 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 370037 ']' 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:04.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:04.504 23:41:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:04.504 [2024-07-24 23:41:49.335382] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:20:04.504 [2024-07-24 23:41:49.335421] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid370037 ] 00:20:04.504 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:04.504 Zero copy mechanism will not be used. 
00:20:04.504 [2024-07-24 23:41:49.399432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:04.504 [2024-07-24 23:41:49.478519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:04.763 [2024-07-24 23:41:49.535249] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:04.763 [2024-07-24 23:41:49.535276] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:05.331 23:41:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:05.331 23:41:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:20:05.331 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:05.331 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:05.331 BaseBdev1_malloc 00:20:05.331 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:05.590 [2024-07-24 23:41:50.475604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:05.590 [2024-07-24 23:41:50.475638] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:05.590 [2024-07-24 23:41:50.475653] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167f1c0 00:20:05.590 [2024-07-24 23:41:50.475676] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.590 [2024-07-24 23:41:50.476824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.590 [2024-07-24 23:41:50.476843] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:05.590 BaseBdev1 
00:20:05.590 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:05.590 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:05.848 BaseBdev2_malloc 00:20:05.849 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:05.849 [2024-07-24 23:41:50.812057] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:05.849 [2024-07-24 23:41:50.812090] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:05.849 [2024-07-24 23:41:50.812106] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x167fce0 00:20:05.849 [2024-07-24 23:41:50.812113] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.849 [2024-07-24 23:41:50.813168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.849 [2024-07-24 23:41:50.813189] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:05.849 BaseBdev2 00:20:05.849 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:05.849 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:06.108 BaseBdev3_malloc 00:20:06.108 23:41:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:06.367 [2024-07-24 23:41:51.156519] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:06.367 [2024-07-24 23:41:51.156551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.367 [2024-07-24 23:41:51.156563] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x182cd70 00:20:06.367 [2024-07-24 23:41:51.156573] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.367 [2024-07-24 23:41:51.157610] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.367 [2024-07-24 23:41:51.157630] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:06.367 BaseBdev3 00:20:06.367 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:06.367 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:06.367 BaseBdev4_malloc 00:20:06.367 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:06.625 [2024-07-24 23:41:51.488940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:06.626 [2024-07-24 23:41:51.488972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.626 [2024-07-24 23:41:51.488984] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x182bf50 00:20:06.626 [2024-07-24 23:41:51.489006] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.626 [2024-07-24 23:41:51.490043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.626 [2024-07-24 23:41:51.490063] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:20:06.626 BaseBdev4 00:20:06.626 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:06.884 spare_malloc 00:20:06.884 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:06.884 spare_delay 00:20:06.884 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:07.143 [2024-07-24 23:41:51.977779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:07.143 [2024-07-24 23:41:51.977808] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.143 [2024-07-24 23:41:51.977819] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1830a30 00:20:07.143 [2024-07-24 23:41:51.977825] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.143 [2024-07-24 23:41:51.978899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.143 [2024-07-24 23:41:51.978918] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:07.143 spare 00:20:07.143 23:41:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:07.401 [2024-07-24 23:41:52.146238] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:07.401 [2024-07-24 23:41:52.147136] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:20:07.401 [2024-07-24 23:41:52.147175] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:07.401 [2024-07-24 23:41:52.147205] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:07.401 [2024-07-24 23:41:52.147260] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x17afd20 00:20:07.401 [2024-07-24 23:41:52.147265] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:07.401 [2024-07-24 23:41:52.147411] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x182a290 00:20:07.401 [2024-07-24 23:41:52.147520] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17afd20 00:20:07.401 [2024-07-24 23:41:52.147526] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17afd20 00:20:07.401 [2024-07-24 23:41:52.147608] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.401 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.402 "name": "raid_bdev1", 00:20:07.402 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:07.402 "strip_size_kb": 0, 00:20:07.402 "state": "online", 00:20:07.402 "raid_level": "raid1", 00:20:07.402 "superblock": false, 00:20:07.402 "num_base_bdevs": 4, 00:20:07.402 "num_base_bdevs_discovered": 4, 00:20:07.402 "num_base_bdevs_operational": 4, 00:20:07.402 "base_bdevs_list": [ 00:20:07.402 { 00:20:07.402 "name": "BaseBdev1", 00:20:07.402 "uuid": "a51f6362-e42c-566e-a6b1-5bdc817b4206", 00:20:07.402 "is_configured": true, 00:20:07.402 "data_offset": 0, 00:20:07.402 "data_size": 65536 00:20:07.402 }, 00:20:07.402 { 00:20:07.402 "name": "BaseBdev2", 00:20:07.402 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:07.402 "is_configured": true, 00:20:07.402 "data_offset": 0, 00:20:07.402 "data_size": 65536 00:20:07.402 }, 00:20:07.402 { 00:20:07.402 "name": "BaseBdev3", 00:20:07.402 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:07.402 "is_configured": true, 00:20:07.402 "data_offset": 0, 00:20:07.402 "data_size": 65536 00:20:07.402 }, 00:20:07.402 { 00:20:07.402 "name": "BaseBdev4", 00:20:07.402 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:07.402 "is_configured": true, 00:20:07.402 "data_offset": 0, 00:20:07.402 "data_size": 65536 00:20:07.402 } 00:20:07.402 ] 00:20:07.402 }' 00:20:07.402 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:20:07.402 23:41:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:07.969 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:07.969 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:07.969 [2024-07-24 23:41:52.964549] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:08.228 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:20:08.228 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.228 23:41:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:08.228 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:20:08.228 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:08.228 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:08.228 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:08.488 [2024-07-24 23:41:53.242840] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b5df0 00:20:08.488 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:08.488 Zero copy mechanism will not be used. 00:20:08.488 Running I/O for 60 seconds... 
00:20:08.488 [2024-07-24 23:41:53.302648] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:08.488 [2024-07-24 23:41:53.302814] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x17b5df0 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.488 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.753 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.753 "name": "raid_bdev1", 00:20:08.753 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:08.753 "strip_size_kb": 0, 00:20:08.753 "state": "online", 00:20:08.753 "raid_level": "raid1", 00:20:08.753 "superblock": false, 
00:20:08.753 "num_base_bdevs": 4, 00:20:08.753 "num_base_bdevs_discovered": 3, 00:20:08.753 "num_base_bdevs_operational": 3, 00:20:08.753 "base_bdevs_list": [ 00:20:08.753 { 00:20:08.753 "name": null, 00:20:08.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.753 "is_configured": false, 00:20:08.753 "data_offset": 0, 00:20:08.754 "data_size": 65536 00:20:08.754 }, 00:20:08.754 { 00:20:08.754 "name": "BaseBdev2", 00:20:08.754 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:08.754 "is_configured": true, 00:20:08.754 "data_offset": 0, 00:20:08.754 "data_size": 65536 00:20:08.754 }, 00:20:08.754 { 00:20:08.754 "name": "BaseBdev3", 00:20:08.754 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:08.754 "is_configured": true, 00:20:08.754 "data_offset": 0, 00:20:08.754 "data_size": 65536 00:20:08.754 }, 00:20:08.754 { 00:20:08.754 "name": "BaseBdev4", 00:20:08.754 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:08.754 "is_configured": true, 00:20:08.754 "data_offset": 0, 00:20:08.754 "data_size": 65536 00:20:08.754 } 00:20:08.754 ] 00:20:08.754 }' 00:20:08.754 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.754 23:41:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:09.014 23:41:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:09.272 [2024-07-24 23:41:54.148226] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:09.273 23:41:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:09.273 [2024-07-24 23:41:54.194211] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1385b50 00:20:09.273 [2024-07-24 23:41:54.195750] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:09.531 [2024-07-24 
23:41:54.310851] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:09.531 [2024-07-24 23:41:54.311113] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:09.531 [2024-07-24 23:41:54.444732] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:09.531 [2024-07-24 23:41:54.444893] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:09.790 [2024-07-24 23:41:54.667964] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:09.790 [2024-07-24 23:41:54.669063] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:10.048 [2024-07-24 23:41:54.885135] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:10.048 [2024-07-24 23:41:54.885575] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:10.306 [2024-07-24 23:41:55.197827] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:10.306 
23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.306 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.564 [2024-07-24 23:41:55.315480] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:10.564 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:10.564 "name": "raid_bdev1", 00:20:10.564 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:10.564 "strip_size_kb": 0, 00:20:10.564 "state": "online", 00:20:10.564 "raid_level": "raid1", 00:20:10.564 "superblock": false, 00:20:10.564 "num_base_bdevs": 4, 00:20:10.564 "num_base_bdevs_discovered": 4, 00:20:10.564 "num_base_bdevs_operational": 4, 00:20:10.564 "process": { 00:20:10.564 "type": "rebuild", 00:20:10.564 "target": "spare", 00:20:10.564 "progress": { 00:20:10.564 "blocks": 16384, 00:20:10.564 "percent": 25 00:20:10.564 } 00:20:10.564 }, 00:20:10.564 "base_bdevs_list": [ 00:20:10.564 { 00:20:10.564 "name": "spare", 00:20:10.564 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:10.564 "is_configured": true, 00:20:10.564 "data_offset": 0, 00:20:10.564 "data_size": 65536 00:20:10.564 }, 00:20:10.564 { 00:20:10.564 "name": "BaseBdev2", 00:20:10.564 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:10.564 "is_configured": true, 00:20:10.564 "data_offset": 0, 00:20:10.564 "data_size": 65536 00:20:10.564 }, 00:20:10.564 { 00:20:10.564 "name": "BaseBdev3", 00:20:10.564 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:10.565 "is_configured": true, 00:20:10.565 "data_offset": 0, 00:20:10.565 "data_size": 65536 00:20:10.565 }, 00:20:10.565 { 00:20:10.565 "name": "BaseBdev4", 00:20:10.565 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:10.565 "is_configured": true, 
00:20:10.565 "data_offset": 0, 00:20:10.565 "data_size": 65536 00:20:10.565 } 00:20:10.565 ] 00:20:10.565 }' 00:20:10.565 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:10.565 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:10.565 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:10.565 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:10.565 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:10.565 [2024-07-24 23:41:55.553447] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:10.565 [2024-07-24 23:41:55.554530] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:10.823 [2024-07-24 23:41:55.612432] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:10.823 [2024-07-24 23:41:55.777205] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:10.823 [2024-07-24 23:41:55.785186] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.823 [2024-07-24 23:41:55.785213] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:10.823 [2024-07-24 23:41:55.785219] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:10.823 [2024-07-24 23:41:55.802866] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x17b5df0 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 
00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.082 23:41:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.082 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.082 "name": "raid_bdev1", 00:20:11.082 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:11.082 "strip_size_kb": 0, 00:20:11.082 "state": "online", 00:20:11.082 "raid_level": "raid1", 00:20:11.082 "superblock": false, 00:20:11.082 "num_base_bdevs": 4, 00:20:11.082 "num_base_bdevs_discovered": 3, 00:20:11.082 "num_base_bdevs_operational": 3, 00:20:11.082 "base_bdevs_list": [ 00:20:11.082 { 00:20:11.082 "name": null, 00:20:11.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.082 "is_configured": false, 00:20:11.082 "data_offset": 0, 00:20:11.082 "data_size": 65536 00:20:11.082 
}, 00:20:11.082 { 00:20:11.082 "name": "BaseBdev2", 00:20:11.082 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:11.082 "is_configured": true, 00:20:11.082 "data_offset": 0, 00:20:11.082 "data_size": 65536 00:20:11.082 }, 00:20:11.082 { 00:20:11.082 "name": "BaseBdev3", 00:20:11.082 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:11.082 "is_configured": true, 00:20:11.082 "data_offset": 0, 00:20:11.082 "data_size": 65536 00:20:11.082 }, 00:20:11.082 { 00:20:11.082 "name": "BaseBdev4", 00:20:11.082 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:11.082 "is_configured": true, 00:20:11.082 "data_offset": 0, 00:20:11.082 "data_size": 65536 00:20:11.082 } 00:20:11.082 ] 00:20:11.082 }' 00:20:11.082 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.082 23:41:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.648 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:11.907 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:11.907 "name": "raid_bdev1", 00:20:11.907 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:11.907 
"strip_size_kb": 0, 00:20:11.907 "state": "online", 00:20:11.907 "raid_level": "raid1", 00:20:11.907 "superblock": false, 00:20:11.907 "num_base_bdevs": 4, 00:20:11.907 "num_base_bdevs_discovered": 3, 00:20:11.907 "num_base_bdevs_operational": 3, 00:20:11.907 "base_bdevs_list": [ 00:20:11.907 { 00:20:11.907 "name": null, 00:20:11.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:11.907 "is_configured": false, 00:20:11.907 "data_offset": 0, 00:20:11.907 "data_size": 65536 00:20:11.907 }, 00:20:11.907 { 00:20:11.907 "name": "BaseBdev2", 00:20:11.907 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:11.907 "is_configured": true, 00:20:11.907 "data_offset": 0, 00:20:11.907 "data_size": 65536 00:20:11.907 }, 00:20:11.907 { 00:20:11.907 "name": "BaseBdev3", 00:20:11.907 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:11.907 "is_configured": true, 00:20:11.907 "data_offset": 0, 00:20:11.907 "data_size": 65536 00:20:11.907 }, 00:20:11.907 { 00:20:11.907 "name": "BaseBdev4", 00:20:11.907 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:11.907 "is_configured": true, 00:20:11.907 "data_offset": 0, 00:20:11.907 "data_size": 65536 00:20:11.907 } 00:20:11.907 ] 00:20:11.907 }' 00:20:11.908 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:11.908 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:11.908 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:11.908 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:11.908 23:41:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:12.167 [2024-07-24 23:41:56.963405] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:12.167 
23:41:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:12.167 [2024-07-24 23:41:57.014690] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b65d0 00:20:12.167 [2024-07-24 23:41:57.015798] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:12.167 [2024-07-24 23:41:57.137510] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:12.167 [2024-07-24 23:41:57.137801] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:12.425 [2024-07-24 23:41:57.340916] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:12.425 [2024-07-24 23:41:57.341423] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:12.684 [2024-07-24 23:41:57.667177] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:12.942 [2024-07-24 23:41:57.776129] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.200 [2024-07-24 23:41:58.131913] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:13.200 [2024-07-24 23:41:58.132976] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:13.200 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:13.200 "name": "raid_bdev1", 00:20:13.200 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:13.200 "strip_size_kb": 0, 00:20:13.200 "state": "online", 00:20:13.200 "raid_level": "raid1", 00:20:13.201 "superblock": false, 00:20:13.201 "num_base_bdevs": 4, 00:20:13.201 "num_base_bdevs_discovered": 4, 00:20:13.201 "num_base_bdevs_operational": 4, 00:20:13.201 "process": { 00:20:13.201 "type": "rebuild", 00:20:13.201 "target": "spare", 00:20:13.201 "progress": { 00:20:13.201 "blocks": 14336, 00:20:13.201 "percent": 21 00:20:13.201 } 00:20:13.201 }, 00:20:13.201 "base_bdevs_list": [ 00:20:13.201 { 00:20:13.201 "name": "spare", 00:20:13.201 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:13.201 "is_configured": true, 00:20:13.201 "data_offset": 0, 00:20:13.201 "data_size": 65536 00:20:13.201 }, 00:20:13.201 { 00:20:13.201 "name": "BaseBdev2", 00:20:13.201 "uuid": "5115d1c5-a0ed-5eec-9cd3-023caa1fb2d2", 00:20:13.201 "is_configured": true, 00:20:13.201 "data_offset": 0, 00:20:13.201 "data_size": 65536 00:20:13.201 }, 00:20:13.201 { 00:20:13.201 "name": "BaseBdev3", 00:20:13.201 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:13.201 "is_configured": true, 00:20:13.201 "data_offset": 0, 00:20:13.201 "data_size": 65536 00:20:13.201 }, 00:20:13.201 { 00:20:13.201 "name": "BaseBdev4", 00:20:13.201 
"uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:13.201 "is_configured": true, 00:20:13.201 "data_offset": 0, 00:20:13.201 "data_size": 65536 00:20:13.201 } 00:20:13.201 ] 00:20:13.201 }' 00:20:13.201 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:13.460 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:13.460 [2024-07-24 23:41:58.417884] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:13.719 [2024-07-24 23:41:58.568939] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17b5df0 00:20:13.719 [2024-07-24 23:41:58.568962] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17b65d0 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild 
spare 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.719 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:13.719 [2024-07-24 23:41:58.689934] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:13.978 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:13.978 "name": "raid_bdev1", 00:20:13.978 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:13.978 "strip_size_kb": 0, 00:20:13.978 "state": "online", 00:20:13.978 "raid_level": "raid1", 00:20:13.978 "superblock": false, 00:20:13.978 "num_base_bdevs": 4, 00:20:13.978 "num_base_bdevs_discovered": 3, 00:20:13.978 "num_base_bdevs_operational": 3, 00:20:13.978 "process": { 00:20:13.978 "type": "rebuild", 00:20:13.979 "target": "spare", 00:20:13.979 "progress": { 00:20:13.979 "blocks": 20480, 00:20:13.979 "percent": 31 00:20:13.979 } 00:20:13.979 }, 00:20:13.979 "base_bdevs_list": [ 00:20:13.979 { 00:20:13.979 "name": "spare", 00:20:13.979 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:13.979 "is_configured": true, 00:20:13.979 "data_offset": 0, 00:20:13.979 "data_size": 65536 00:20:13.979 }, 00:20:13.979 { 00:20:13.979 "name": null, 00:20:13.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.979 "is_configured": false, 00:20:13.979 
"data_offset": 0, 00:20:13.979 "data_size": 65536 00:20:13.979 }, 00:20:13.979 { 00:20:13.979 "name": "BaseBdev3", 00:20:13.979 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:13.979 "is_configured": true, 00:20:13.979 "data_offset": 0, 00:20:13.979 "data_size": 65536 00:20:13.979 }, 00:20:13.979 { 00:20:13.979 "name": "BaseBdev4", 00:20:13.979 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:13.979 "is_configured": true, 00:20:13.979 "data_offset": 0, 00:20:13.979 "data_size": 65536 00:20:13.979 } 00:20:13.979 ] 00:20:13.979 }' 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=711 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:13.979 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.979 23:41:58 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:14.237 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:14.237 "name": "raid_bdev1", 00:20:14.237 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:14.237 "strip_size_kb": 0, 00:20:14.237 "state": "online", 00:20:14.238 "raid_level": "raid1", 00:20:14.238 "superblock": false, 00:20:14.238 "num_base_bdevs": 4, 00:20:14.238 "num_base_bdevs_discovered": 3, 00:20:14.238 "num_base_bdevs_operational": 3, 00:20:14.238 "process": { 00:20:14.238 "type": "rebuild", 00:20:14.238 "target": "spare", 00:20:14.238 "progress": { 00:20:14.238 "blocks": 22528, 00:20:14.238 "percent": 34 00:20:14.238 } 00:20:14.238 }, 00:20:14.238 "base_bdevs_list": [ 00:20:14.238 { 00:20:14.238 "name": "spare", 00:20:14.238 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:14.238 "is_configured": true, 00:20:14.238 "data_offset": 0, 00:20:14.238 "data_size": 65536 00:20:14.238 }, 00:20:14.238 { 00:20:14.238 "name": null, 00:20:14.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.238 "is_configured": false, 00:20:14.238 "data_offset": 0, 00:20:14.238 "data_size": 65536 00:20:14.238 }, 00:20:14.238 { 00:20:14.238 "name": "BaseBdev3", 00:20:14.238 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:14.238 "is_configured": true, 00:20:14.238 "data_offset": 0, 00:20:14.238 "data_size": 65536 00:20:14.238 }, 00:20:14.238 { 00:20:14.238 "name": "BaseBdev4", 00:20:14.238 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:14.238 "is_configured": true, 00:20:14.238 "data_offset": 0, 00:20:14.238 "data_size": 65536 00:20:14.238 } 00:20:14.238 ] 00:20:14.238 }' 00:20:14.238 23:41:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:14.238 23:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:14.238 23:41:59 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:14.238 23:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:14.238 23:41:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:14.238 [2024-07-24 23:41:59.234258] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.173 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:15.431 [2024-07-24 23:42:00.234286] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:20:15.432 [2024-07-24 23:42:00.234658] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:15.432 "name": "raid_bdev1", 00:20:15.432 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:15.432 
"strip_size_kb": 0, 00:20:15.432 "state": "online", 00:20:15.432 "raid_level": "raid1", 00:20:15.432 "superblock": false, 00:20:15.432 "num_base_bdevs": 4, 00:20:15.432 "num_base_bdevs_discovered": 3, 00:20:15.432 "num_base_bdevs_operational": 3, 00:20:15.432 "process": { 00:20:15.432 "type": "rebuild", 00:20:15.432 "target": "spare", 00:20:15.432 "progress": { 00:20:15.432 "blocks": 45056, 00:20:15.432 "percent": 68 00:20:15.432 } 00:20:15.432 }, 00:20:15.432 "base_bdevs_list": [ 00:20:15.432 { 00:20:15.432 "name": "spare", 00:20:15.432 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:15.432 "is_configured": true, 00:20:15.432 "data_offset": 0, 00:20:15.432 "data_size": 65536 00:20:15.432 }, 00:20:15.432 { 00:20:15.432 "name": null, 00:20:15.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.432 "is_configured": false, 00:20:15.432 "data_offset": 0, 00:20:15.432 "data_size": 65536 00:20:15.432 }, 00:20:15.432 { 00:20:15.432 "name": "BaseBdev3", 00:20:15.432 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:15.432 "is_configured": true, 00:20:15.432 "data_offset": 0, 00:20:15.432 "data_size": 65536 00:20:15.432 }, 00:20:15.432 { 00:20:15.432 "name": "BaseBdev4", 00:20:15.432 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:15.432 "is_configured": true, 00:20:15.432 "data_offset": 0, 00:20:15.432 "data_size": 65536 00:20:15.432 } 00:20:15.432 ] 00:20:15.432 }' 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:15.432 [2024-07-24 23:42:00.349828] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # 
[[ spare == \s\p\a\r\e ]] 00:20:15.432 23:42:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:15.999 [2024-07-24 23:42:00.801033] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:16.257 [2024-07-24 23:42:01.141807] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:16.552 "name": "raid_bdev1", 00:20:16.552 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:16.552 "strip_size_kb": 0, 00:20:16.552 "state": "online", 00:20:16.552 "raid_level": "raid1", 00:20:16.552 "superblock": false, 00:20:16.552 "num_base_bdevs": 4, 00:20:16.552 "num_base_bdevs_discovered": 3, 00:20:16.552 "num_base_bdevs_operational": 3, 00:20:16.552 "process": { 00:20:16.552 "type": "rebuild", 00:20:16.552 "target": "spare", 00:20:16.552 
"progress": { 00:20:16.552 "blocks": 63488, 00:20:16.552 "percent": 96 00:20:16.552 } 00:20:16.552 }, 00:20:16.552 "base_bdevs_list": [ 00:20:16.552 { 00:20:16.552 "name": "spare", 00:20:16.552 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:16.552 "is_configured": true, 00:20:16.552 "data_offset": 0, 00:20:16.552 "data_size": 65536 00:20:16.552 }, 00:20:16.552 { 00:20:16.552 "name": null, 00:20:16.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.552 "is_configured": false, 00:20:16.552 "data_offset": 0, 00:20:16.552 "data_size": 65536 00:20:16.552 }, 00:20:16.552 { 00:20:16.552 "name": "BaseBdev3", 00:20:16.552 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:16.552 "is_configured": true, 00:20:16.552 "data_offset": 0, 00:20:16.552 "data_size": 65536 00:20:16.552 }, 00:20:16.552 { 00:20:16.552 "name": "BaseBdev4", 00:20:16.552 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:16.552 "is_configured": true, 00:20:16.552 "data_offset": 0, 00:20:16.552 "data_size": 65536 00:20:16.552 } 00:20:16.552 ] 00:20:16.552 }' 00:20:16.552 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:16.811 [2024-07-24 23:42:01.572382] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:16.811 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:16.811 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:16.811 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:16.811 23:42:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:16.811 [2024-07-24 23:42:01.678039] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:16.811 [2024-07-24 23:42:01.679983] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.745 
23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.745 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.003 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.004 "name": "raid_bdev1", 00:20:18.004 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:18.004 "strip_size_kb": 0, 00:20:18.004 "state": "online", 00:20:18.004 "raid_level": "raid1", 00:20:18.004 "superblock": false, 00:20:18.004 "num_base_bdevs": 4, 00:20:18.004 "num_base_bdevs_discovered": 3, 00:20:18.004 "num_base_bdevs_operational": 3, 00:20:18.004 "base_bdevs_list": [ 00:20:18.004 { 00:20:18.004 "name": "spare", 00:20:18.004 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:18.004 "is_configured": true, 00:20:18.004 "data_offset": 0, 00:20:18.004 "data_size": 65536 00:20:18.004 }, 00:20:18.004 { 00:20:18.004 "name": null, 00:20:18.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.004 "is_configured": false, 00:20:18.004 "data_offset": 0, 00:20:18.004 "data_size": 65536 00:20:18.004 }, 00:20:18.004 { 00:20:18.004 "name": "BaseBdev3", 00:20:18.004 "uuid": 
"30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:18.004 "is_configured": true, 00:20:18.004 "data_offset": 0, 00:20:18.004 "data_size": 65536 00:20:18.004 }, 00:20:18.004 { 00:20:18.004 "name": "BaseBdev4", 00:20:18.004 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:18.004 "is_configured": true, 00:20:18.004 "data_offset": 0, 00:20:18.004 "data_size": 65536 00:20:18.004 } 00:20:18.004 ] 00:20:18.004 }' 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.004 23:42:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:18.262 "name": "raid_bdev1", 00:20:18.262 "uuid": 
"e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:18.262 "strip_size_kb": 0, 00:20:18.262 "state": "online", 00:20:18.262 "raid_level": "raid1", 00:20:18.262 "superblock": false, 00:20:18.262 "num_base_bdevs": 4, 00:20:18.262 "num_base_bdevs_discovered": 3, 00:20:18.262 "num_base_bdevs_operational": 3, 00:20:18.262 "base_bdevs_list": [ 00:20:18.262 { 00:20:18.262 "name": "spare", 00:20:18.262 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:18.262 "is_configured": true, 00:20:18.262 "data_offset": 0, 00:20:18.262 "data_size": 65536 00:20:18.262 }, 00:20:18.262 { 00:20:18.262 "name": null, 00:20:18.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.262 "is_configured": false, 00:20:18.262 "data_offset": 0, 00:20:18.262 "data_size": 65536 00:20:18.262 }, 00:20:18.262 { 00:20:18.262 "name": "BaseBdev3", 00:20:18.262 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:18.262 "is_configured": true, 00:20:18.262 "data_offset": 0, 00:20:18.262 "data_size": 65536 00:20:18.262 }, 00:20:18.262 { 00:20:18.262 "name": "BaseBdev4", 00:20:18.262 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:18.262 "is_configured": true, 00:20:18.262 "data_offset": 0, 00:20:18.262 "data_size": 65536 00:20:18.262 } 00:20:18.262 ] 00:20:18.262 }' 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.262 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:18.520 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.520 "name": "raid_bdev1", 00:20:18.520 "uuid": "e6bcc601-67a2-418e-a9f3-279c8afbc422", 00:20:18.520 "strip_size_kb": 0, 00:20:18.520 "state": "online", 00:20:18.520 "raid_level": "raid1", 00:20:18.520 "superblock": false, 00:20:18.520 "num_base_bdevs": 4, 00:20:18.520 "num_base_bdevs_discovered": 3, 00:20:18.520 "num_base_bdevs_operational": 3, 00:20:18.520 "base_bdevs_list": [ 00:20:18.520 { 00:20:18.520 "name": "spare", 00:20:18.520 "uuid": "1443d845-5a78-5a2a-9d96-c4a268b8273e", 00:20:18.520 "is_configured": true, 00:20:18.520 "data_offset": 0, 00:20:18.520 "data_size": 65536 00:20:18.520 }, 00:20:18.520 { 00:20:18.520 "name": null, 00:20:18.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.520 "is_configured": false, 00:20:18.520 
"data_offset": 0, 00:20:18.520 "data_size": 65536 00:20:18.520 }, 00:20:18.520 { 00:20:18.520 "name": "BaseBdev3", 00:20:18.520 "uuid": "30a25f34-24b9-5732-bd38-2ee49cbf4d98", 00:20:18.520 "is_configured": true, 00:20:18.520 "data_offset": 0, 00:20:18.520 "data_size": 65536 00:20:18.520 }, 00:20:18.520 { 00:20:18.520 "name": "BaseBdev4", 00:20:18.520 "uuid": "03b91718-4f66-519e-8cda-0acb7d160f13", 00:20:18.520 "is_configured": true, 00:20:18.520 "data_offset": 0, 00:20:18.520 "data_size": 65536 00:20:18.520 } 00:20:18.520 ] 00:20:18.521 }' 00:20:18.521 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.521 23:42:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:19.095 23:42:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:19.095 [2024-07-24 23:42:03.946074] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:19.095 [2024-07-24 23:42:03.946100] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:19.095 00:20:19.095 Latency(us) 00:20:19.095 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:19.095 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:19.095 raid_bdev1 : 10.78 103.38 310.15 0.00 0.00 13578.31 238.93 116342.00 00:20:19.095 =================================================================================================================== 00:20:19.095 Total : 103.38 310.15 0.00 0.00 13578.31 238.93 116342.00 00:20:19.095 [2024-07-24 23:42:04.049097] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:19.095 [2024-07-24 23:42:04.049118] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:19.095 [2024-07-24 23:42:04.049181] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:19.095 [2024-07-24 23:42:04.049187] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17afd20 name raid_bdev1, state offline 00:20:19.095 0 00:20:19.095 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.095 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:19.363 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:19.363 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:20:19.363 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.364 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk 
spare /dev/nbd0 00:20:19.623 /dev/nbd0 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:19.623 1+0 records in 00:20:19.623 1+0 records out 00:20:19.623 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000160409 s, 25.5 MB/s 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:19.623 23:42:04 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.623 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:20:19.623 /dev/nbd1 00:20:19.893 
23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:19.893 1+0 records in 00:20:19.893 1+0 records out 00:20:19.893 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230394 s, 17.8 MB/s 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@889 -- # return 0 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:19.893 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:19.894 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:19.894 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:19.894 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:19.894 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:19.894 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 
00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:20.203 23:42:04 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:20:20.203 /dev/nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:20.203 1+0 records in 00:20:20.203 1+0 records out 00:20:20.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000193466 s, 21.2 MB/s 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:20.203 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:20:20.462 
23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:20.462 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:20.721 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:20.721 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:20.721 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:20.721 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 370037 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 370037 ']' 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 370037 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 370037 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:20.722 23:42:05 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 370037' 00:20:20.722 killing process with pid 370037 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 370037 00:20:20.722 Received shutdown signal, test time was about 12.292866 seconds 00:20:20.722 00:20:20.722 Latency(us) 00:20:20.722 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:20.722 =================================================================================================================== 00:20:20.722 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:20.722 [2024-07-24 23:42:05.567323] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:20.722 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 370037 00:20:20.722 [2024-07-24 23:42:05.601827] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:20.981 00:20:20.981 real 0m16.498s 00:20:20.981 user 0m24.811s 00:20:20.981 sys 0m2.193s 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:20.981 ************************************ 00:20:20.981 END TEST raid_rebuild_test_io 00:20:20.981 ************************************ 00:20:20.981 23:42:05 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:20:20.981 23:42:05 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:20:20.981 23:42:05 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:20.981 23:42:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
00:20:20.981 ************************************ 00:20:20.981 START TEST raid_rebuild_test_sb_io 00:20:20.981 ************************************ 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true true true 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:20.981 23:42:05 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=373154 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 373154 /var/tmp/spdk-raid.sock 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:20.981 23:42:05 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 373154 ']' 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:20.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:20.981 23:42:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:20.981 [2024-07-24 23:42:05.908414] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:20:20.981 [2024-07-24 23:42:05.908454] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid373154 ] 00:20:20.981 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:20.981 Zero copy mechanism will not be used. 
00:20:20.981 [2024-07-24 23:42:05.973111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:21.240 [2024-07-24 23:42:06.042969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:21.240 [2024-07-24 23:42:06.095089] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.240 [2024-07-24 23:42:06.095109] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.807 23:42:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:21.807 23:42:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:20:21.807 23:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:21.807 23:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:22.066 BaseBdev1_malloc 00:20:22.066 23:42:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:22.066 [2024-07-24 23:42:07.027672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:22.066 [2024-07-24 23:42:07.027708] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:22.066 [2024-07-24 23:42:07.027721] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b21c0 00:20:22.066 [2024-07-24 23:42:07.027742] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.066 [2024-07-24 23:42:07.028785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.066 [2024-07-24 23:42:07.028803] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:22.066 
BaseBdev1 00:20:22.066 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:22.066 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:22.325 BaseBdev2_malloc 00:20:22.325 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:22.583 [2024-07-24 23:42:07.363846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:22.583 [2024-07-24 23:42:07.363872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:22.583 [2024-07-24 23:42:07.363884] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b2ce0 00:20:22.583 [2024-07-24 23:42:07.363889] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.583 [2024-07-24 23:42:07.364820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.583 [2024-07-24 23:42:07.364838] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:22.583 BaseBdev2 00:20:22.583 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:22.583 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:22.583 BaseBdev3_malloc 00:20:22.583 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:22.841 [2024-07-24 
23:42:07.716321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:22.841 [2024-07-24 23:42:07.716355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:22.841 [2024-07-24 23:42:07.716365] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245fd70 00:20:22.841 [2024-07-24 23:42:07.716371] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.841 [2024-07-24 23:42:07.717387] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.841 [2024-07-24 23:42:07.717408] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:22.841 BaseBdev3 00:20:22.841 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:20:22.841 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:23.100 BaseBdev4_malloc 00:20:23.100 23:42:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:23.100 [2024-07-24 23:42:08.088932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:23.100 [2024-07-24 23:42:08.088963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.100 [2024-07-24 23:42:08.088974] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245ef50 00:20:23.100 [2024-07-24 23:42:08.088979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.100 [2024-07-24 23:42:08.089951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.100 [2024-07-24 23:42:08.089970] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:23.100 BaseBdev4 00:20:23.358 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:23.358 spare_malloc 00:20:23.358 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:23.617 spare_delay 00:20:23.617 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:23.876 [2024-07-24 23:42:08.625562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:23.876 [2024-07-24 23:42:08.625591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.876 [2024-07-24 23:42:08.625601] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2463a30 00:20:23.876 [2024-07-24 23:42:08.625607] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.876 [2024-07-24 23:42:08.626557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.876 [2024-07-24 23:42:08.626575] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:23.876 spare 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:23.876 [2024-07-24 23:42:08.806054] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:23.876 [2024-07-24 
23:42:08.806852] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:23.876 [2024-07-24 23:42:08.806888] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:23.876 [2024-07-24 23:42:08.806917] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:23.876 [2024-07-24 23:42:08.807051] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e2d20 00:20:23.876 [2024-07-24 23:42:08.807058] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:23.876 [2024-07-24 23:42:08.807178] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x245d290 00:20:23.876 [2024-07-24 23:42:08.807279] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e2d20 00:20:23.876 [2024-07-24 23:42:08.807284] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23e2d20 00:20:23.876 [2024-07-24 23:42:08.807344] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.876 23:42:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.134 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.134 "name": "raid_bdev1", 00:20:24.134 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:24.134 "strip_size_kb": 0, 00:20:24.134 "state": "online", 00:20:24.134 "raid_level": "raid1", 00:20:24.134 "superblock": true, 00:20:24.134 "num_base_bdevs": 4, 00:20:24.134 "num_base_bdevs_discovered": 4, 00:20:24.134 "num_base_bdevs_operational": 4, 00:20:24.134 "base_bdevs_list": [ 00:20:24.134 { 00:20:24.134 "name": "BaseBdev1", 00:20:24.134 "uuid": "b317ded4-e630-53d3-9408-b18033821457", 00:20:24.134 "is_configured": true, 00:20:24.134 "data_offset": 2048, 00:20:24.134 "data_size": 63488 00:20:24.134 }, 00:20:24.134 { 00:20:24.134 "name": "BaseBdev2", 00:20:24.134 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:24.134 "is_configured": true, 00:20:24.134 "data_offset": 2048, 00:20:24.134 "data_size": 63488 00:20:24.134 }, 00:20:24.134 { 00:20:24.134 "name": "BaseBdev3", 00:20:24.134 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:24.134 "is_configured": true, 00:20:24.134 "data_offset": 2048, 00:20:24.135 "data_size": 63488 00:20:24.135 }, 00:20:24.135 { 00:20:24.135 "name": "BaseBdev4", 00:20:24.135 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:24.135 "is_configured": true, 00:20:24.135 "data_offset": 2048, 00:20:24.135 "data_size": 63488 00:20:24.135 } 00:20:24.135 ] 
00:20:24.135 }' 00:20:24.135 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.135 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:24.701 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:24.701 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:20:24.701 [2024-07-24 23:42:09.636387] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:24.701 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:20:24.701 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.701 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:24.960 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:20:24.960 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:20:24.960 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:24.960 23:42:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:24.960 [2024-07-24 23:42:09.906697] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b1af0 00:20:24.960 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:24.960 Zero copy mechanism will not be used. 
00:20:24.960 Running I/O for 60 seconds... 00:20:25.219 [2024-07-24 23:42:09.983633] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:25.219 [2024-07-24 23:42:09.988764] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22b1af0 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.219 "name": "raid_bdev1", 00:20:25.219 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:25.219 "strip_size_kb": 0, 00:20:25.219 "state": 
"online", 00:20:25.219 "raid_level": "raid1", 00:20:25.219 "superblock": true, 00:20:25.219 "num_base_bdevs": 4, 00:20:25.219 "num_base_bdevs_discovered": 3, 00:20:25.219 "num_base_bdevs_operational": 3, 00:20:25.219 "base_bdevs_list": [ 00:20:25.219 { 00:20:25.219 "name": null, 00:20:25.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.219 "is_configured": false, 00:20:25.219 "data_offset": 2048, 00:20:25.219 "data_size": 63488 00:20:25.219 }, 00:20:25.219 { 00:20:25.219 "name": "BaseBdev2", 00:20:25.219 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:25.219 "is_configured": true, 00:20:25.219 "data_offset": 2048, 00:20:25.219 "data_size": 63488 00:20:25.219 }, 00:20:25.219 { 00:20:25.219 "name": "BaseBdev3", 00:20:25.219 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:25.219 "is_configured": true, 00:20:25.219 "data_offset": 2048, 00:20:25.219 "data_size": 63488 00:20:25.219 }, 00:20:25.219 { 00:20:25.219 "name": "BaseBdev4", 00:20:25.219 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:25.219 "is_configured": true, 00:20:25.219 "data_offset": 2048, 00:20:25.219 "data_size": 63488 00:20:25.219 } 00:20:25.219 ] 00:20:25.219 }' 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.219 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:25.787 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:26.045 [2024-07-24 23:42:10.880847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:26.045 [2024-07-24 23:42:10.935888] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23e5080 00:20:26.045 23:42:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:20:26.046 [2024-07-24 23:42:10.937593] 
bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:26.305 [2024-07-24 23:42:11.053868] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:26.305 [2024-07-24 23:42:11.054204] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:26.305 [2024-07-24 23:42:11.269924] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:26.305 [2024-07-24 23:42:11.270117] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:26.563 [2024-07-24 23:42:11.508970] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:26.563 [2024-07-24 23:42:11.510003] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:27.131 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.131 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:27.131 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:27.131 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:27.132 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:27.132 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.132 23:42:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.132 [2024-07-24 
23:42:12.104388] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:27.132 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:27.132 "name": "raid_bdev1", 00:20:27.132 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:27.132 "strip_size_kb": 0, 00:20:27.132 "state": "online", 00:20:27.132 "raid_level": "raid1", 00:20:27.132 "superblock": true, 00:20:27.132 "num_base_bdevs": 4, 00:20:27.132 "num_base_bdevs_discovered": 4, 00:20:27.132 "num_base_bdevs_operational": 4, 00:20:27.132 "process": { 00:20:27.132 "type": "rebuild", 00:20:27.132 "target": "spare", 00:20:27.132 "progress": { 00:20:27.132 "blocks": 14336, 00:20:27.132 "percent": 22 00:20:27.132 } 00:20:27.132 }, 00:20:27.132 "base_bdevs_list": [ 00:20:27.132 { 00:20:27.132 "name": "spare", 00:20:27.132 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:27.132 "is_configured": true, 00:20:27.132 "data_offset": 2048, 00:20:27.132 "data_size": 63488 00:20:27.132 }, 00:20:27.132 { 00:20:27.132 "name": "BaseBdev2", 00:20:27.132 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:27.132 "is_configured": true, 00:20:27.132 "data_offset": 2048, 00:20:27.132 "data_size": 63488 00:20:27.132 }, 00:20:27.132 { 00:20:27.132 "name": "BaseBdev3", 00:20:27.132 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:27.132 "is_configured": true, 00:20:27.132 "data_offset": 2048, 00:20:27.132 "data_size": 63488 00:20:27.132 }, 00:20:27.132 { 00:20:27.132 "name": "BaseBdev4", 00:20:27.132 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:27.132 "is_configured": true, 00:20:27.132 "data_offset": 2048, 00:20:27.132 "data_size": 63488 00:20:27.132 } 00:20:27.132 ] 00:20:27.132 }' 00:20:27.132 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:27.391 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:20:27.391 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:27.391 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.391 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:27.391 [2024-07-24 23:42:12.348306] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:27.391 [2024-07-24 23:42:12.354959] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:27.391 [2024-07-24 23:42:12.373148] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:27.649 [2024-07-24 23:42:12.486288] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:27.649 [2024-07-24 23:42:12.494972] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:27.649 [2024-07-24 23:42:12.494992] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:27.650 [2024-07-24 23:42:12.494997] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:27.650 [2024-07-24 23:42:12.512284] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x22b1af0 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.650 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.909 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.909 "name": "raid_bdev1", 00:20:27.909 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:27.909 "strip_size_kb": 0, 00:20:27.909 "state": "online", 00:20:27.909 "raid_level": "raid1", 00:20:27.909 "superblock": true, 00:20:27.909 "num_base_bdevs": 4, 00:20:27.909 "num_base_bdevs_discovered": 3, 00:20:27.909 "num_base_bdevs_operational": 3, 00:20:27.909 "base_bdevs_list": [ 00:20:27.909 { 00:20:27.909 "name": null, 00:20:27.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.909 "is_configured": false, 00:20:27.909 "data_offset": 2048, 00:20:27.909 "data_size": 63488 00:20:27.909 }, 00:20:27.909 { 00:20:27.909 "name": "BaseBdev2", 00:20:27.909 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:27.909 "is_configured": true, 00:20:27.909 "data_offset": 2048, 00:20:27.909 "data_size": 63488 00:20:27.909 }, 00:20:27.909 { 00:20:27.909 "name": 
"BaseBdev3", 00:20:27.909 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:27.909 "is_configured": true, 00:20:27.909 "data_offset": 2048, 00:20:27.909 "data_size": 63488 00:20:27.909 }, 00:20:27.909 { 00:20:27.909 "name": "BaseBdev4", 00:20:27.909 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:27.909 "is_configured": true, 00:20:27.909 "data_offset": 2048, 00:20:27.909 "data_size": 63488 00:20:27.909 } 00:20:27.909 ] 00:20:27.909 }' 00:20:27.909 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.909 23:42:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:28.476 "name": "raid_bdev1", 00:20:28.476 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:28.476 "strip_size_kb": 0, 00:20:28.476 "state": "online", 00:20:28.476 "raid_level": "raid1", 00:20:28.476 "superblock": true, 00:20:28.476 "num_base_bdevs": 4, 00:20:28.476 "num_base_bdevs_discovered": 3, 00:20:28.476 
"num_base_bdevs_operational": 3, 00:20:28.476 "base_bdevs_list": [ 00:20:28.476 { 00:20:28.476 "name": null, 00:20:28.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.476 "is_configured": false, 00:20:28.476 "data_offset": 2048, 00:20:28.476 "data_size": 63488 00:20:28.476 }, 00:20:28.476 { 00:20:28.476 "name": "BaseBdev2", 00:20:28.476 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:28.476 "is_configured": true, 00:20:28.476 "data_offset": 2048, 00:20:28.476 "data_size": 63488 00:20:28.476 }, 00:20:28.476 { 00:20:28.476 "name": "BaseBdev3", 00:20:28.476 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:28.476 "is_configured": true, 00:20:28.476 "data_offset": 2048, 00:20:28.476 "data_size": 63488 00:20:28.476 }, 00:20:28.476 { 00:20:28.476 "name": "BaseBdev4", 00:20:28.476 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:28.476 "is_configured": true, 00:20:28.476 "data_offset": 2048, 00:20:28.476 "data_size": 63488 00:20:28.476 } 00:20:28.476 ] 00:20:28.476 }' 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:28.476 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:28.734 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:28.734 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:28.734 [2024-07-24 23:42:13.652996] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:28.734 23:42:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:20:28.734 [2024-07-24 23:42:13.699434] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x23e6e30 00:20:28.734 [2024-07-24 23:42:13.700531] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:29.075 [2024-07-24 23:42:13.821665] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:29.075 [2024-07-24 23:42:14.038166] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:29.075 [2024-07-24 23:42:14.038350] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:29.338 [2024-07-24 23:42:14.275184] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:29.339 [2024-07-24 23:42:14.275444] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:29.600 [2024-07-24 23:42:14.395986] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:29.600 [2024-07-24 23:42:14.396519] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.859 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.859 [2024-07-24 23:42:14.835561] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.119 "name": "raid_bdev1", 00:20:30.119 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:30.119 "strip_size_kb": 0, 00:20:30.119 "state": "online", 00:20:30.119 "raid_level": "raid1", 00:20:30.119 "superblock": true, 00:20:30.119 "num_base_bdevs": 4, 00:20:30.119 "num_base_bdevs_discovered": 4, 00:20:30.119 "num_base_bdevs_operational": 4, 00:20:30.119 "process": { 00:20:30.119 "type": "rebuild", 00:20:30.119 "target": "spare", 00:20:30.119 "progress": { 00:20:30.119 "blocks": 16384, 00:20:30.119 "percent": 25 00:20:30.119 } 00:20:30.119 }, 00:20:30.119 "base_bdevs_list": [ 00:20:30.119 { 00:20:30.119 "name": "spare", 00:20:30.119 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:30.119 "is_configured": true, 00:20:30.119 "data_offset": 2048, 00:20:30.119 "data_size": 63488 00:20:30.119 }, 00:20:30.119 { 00:20:30.119 "name": "BaseBdev2", 00:20:30.119 "uuid": "e3acc244-3781-51d4-ac8b-8117b31d8fa9", 00:20:30.119 "is_configured": true, 00:20:30.119 "data_offset": 2048, 00:20:30.119 "data_size": 63488 00:20:30.119 }, 00:20:30.119 { 00:20:30.119 "name": "BaseBdev3", 00:20:30.119 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:30.119 "is_configured": true, 00:20:30.119 "data_offset": 2048, 00:20:30.119 "data_size": 63488 00:20:30.119 }, 00:20:30.119 { 00:20:30.119 "name": "BaseBdev4", 00:20:30.119 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:30.119 "is_configured": true, 00:20:30.119 "data_offset": 2048, 00:20:30.119 "data_size": 63488 
00:20:30.119 } 00:20:30.119 ] 00:20:30.119 }' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:20:30.119 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:20:30.119 23:42:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:30.119 [2024-07-24 23:42:15.096113] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:30.378 [2024-07-24 23:42:15.175411] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:30.378 [2024-07-24 23:42:15.276761] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x22b1af0 00:20:30.378 [2024-07-24 23:42:15.276780] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x23e6e30 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.378 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.637 [2024-07-24 23:42:15.386400] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.637 "name": "raid_bdev1", 00:20:30.637 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:30.637 "strip_size_kb": 0, 00:20:30.637 "state": "online", 00:20:30.637 "raid_level": "raid1", 00:20:30.637 "superblock": true, 00:20:30.637 "num_base_bdevs": 4, 00:20:30.637 "num_base_bdevs_discovered": 3, 00:20:30.637 "num_base_bdevs_operational": 3, 00:20:30.637 "process": { 00:20:30.637 "type": "rebuild", 00:20:30.637 "target": "spare", 00:20:30.637 "progress": { 00:20:30.637 "blocks": 22528, 00:20:30.637 "percent": 35 00:20:30.637 } 00:20:30.637 }, 00:20:30.637 "base_bdevs_list": [ 00:20:30.637 { 00:20:30.637 "name": "spare", 00:20:30.637 "uuid": 
"2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:30.637 "is_configured": true, 00:20:30.637 "data_offset": 2048, 00:20:30.637 "data_size": 63488 00:20:30.637 }, 00:20:30.637 { 00:20:30.637 "name": null, 00:20:30.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.637 "is_configured": false, 00:20:30.637 "data_offset": 2048, 00:20:30.637 "data_size": 63488 00:20:30.637 }, 00:20:30.637 { 00:20:30.637 "name": "BaseBdev3", 00:20:30.637 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:30.637 "is_configured": true, 00:20:30.637 "data_offset": 2048, 00:20:30.637 "data_size": 63488 00:20:30.637 }, 00:20:30.637 { 00:20:30.637 "name": "BaseBdev4", 00:20:30.637 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:30.637 "is_configured": true, 00:20:30.637 "data_offset": 2048, 00:20:30.637 "data_size": 63488 00:20:30.637 } 00:20:30.637 ] 00:20:30.637 }' 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=728 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- 
# local target=spare 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.637 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:30.896 "name": "raid_bdev1", 00:20:30.896 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:30.896 "strip_size_kb": 0, 00:20:30.896 "state": "online", 00:20:30.896 "raid_level": "raid1", 00:20:30.896 "superblock": true, 00:20:30.896 "num_base_bdevs": 4, 00:20:30.896 "num_base_bdevs_discovered": 3, 00:20:30.896 "num_base_bdevs_operational": 3, 00:20:30.896 "process": { 00:20:30.896 "type": "rebuild", 00:20:30.896 "target": "spare", 00:20:30.896 "progress": { 00:20:30.896 "blocks": 26624, 00:20:30.896 "percent": 41 00:20:30.896 } 00:20:30.896 }, 00:20:30.896 "base_bdevs_list": [ 00:20:30.896 { 00:20:30.896 "name": "spare", 00:20:30.896 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:30.896 "is_configured": true, 00:20:30.896 "data_offset": 2048, 00:20:30.896 "data_size": 63488 00:20:30.896 }, 00:20:30.896 { 00:20:30.896 "name": null, 00:20:30.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.896 "is_configured": false, 00:20:30.896 "data_offset": 2048, 00:20:30.896 "data_size": 63488 00:20:30.896 }, 00:20:30.896 { 00:20:30.896 "name": "BaseBdev3", 00:20:30.896 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:30.896 "is_configured": true, 00:20:30.896 "data_offset": 2048, 00:20:30.896 "data_size": 63488 00:20:30.896 }, 00:20:30.896 { 00:20:30.896 "name": "BaseBdev4", 00:20:30.896 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:30.896 "is_configured": true, 00:20:30.896 
"data_offset": 2048, 00:20:30.896 "data_size": 63488 00:20:30.896 } 00:20:30.896 ] 00:20:30.896 }' 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:30.896 [2024-07-24 23:42:15.775604] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:30.896 23:42:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:31.832 [2024-07-24 23:42:16.773071] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.832 23:42:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:20:32.090 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:32.090 "name": "raid_bdev1", 00:20:32.090 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:32.090 "strip_size_kb": 0, 00:20:32.090 "state": "online", 00:20:32.090 "raid_level": "raid1", 00:20:32.090 "superblock": true, 00:20:32.090 "num_base_bdevs": 4, 00:20:32.090 "num_base_bdevs_discovered": 3, 00:20:32.090 "num_base_bdevs_operational": 3, 00:20:32.090 "process": { 00:20:32.090 "type": "rebuild", 00:20:32.090 "target": "spare", 00:20:32.090 "progress": { 00:20:32.090 "blocks": 47104, 00:20:32.090 "percent": 74 00:20:32.090 } 00:20:32.090 }, 00:20:32.090 "base_bdevs_list": [ 00:20:32.090 { 00:20:32.090 "name": "spare", 00:20:32.090 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:32.090 "is_configured": true, 00:20:32.090 "data_offset": 2048, 00:20:32.090 "data_size": 63488 00:20:32.090 }, 00:20:32.090 { 00:20:32.090 "name": null, 00:20:32.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.090 "is_configured": false, 00:20:32.090 "data_offset": 2048, 00:20:32.090 "data_size": 63488 00:20:32.090 }, 00:20:32.090 { 00:20:32.090 "name": "BaseBdev3", 00:20:32.090 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:32.090 "is_configured": true, 00:20:32.090 "data_offset": 2048, 00:20:32.090 "data_size": 63488 00:20:32.090 }, 00:20:32.090 { 00:20:32.090 "name": "BaseBdev4", 00:20:32.090 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:32.090 "is_configured": true, 00:20:32.090 "data_offset": 2048, 00:20:32.090 "data_size": 63488 00:20:32.090 } 00:20:32.090 ] 00:20:32.090 }' 00:20:32.090 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:32.090 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:32.090 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:20:32.349 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:32.349 23:42:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:20:32.349 [2024-07-24 23:42:17.212092] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:32.349 [2024-07-24 23:42:17.212442] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:20:32.917 [2024-07-24 23:42:17.618437] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:20:33.176 [2024-07-24 23:42:17.939547] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:33.176 [2024-07-24 23:42:18.039818] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:33.176 [2024-07-24 23:42:18.041236] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:33.176 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.176 23:42:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:33.433 "name": "raid_bdev1", 00:20:33.433 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:33.433 "strip_size_kb": 0, 00:20:33.433 "state": "online", 00:20:33.433 "raid_level": "raid1", 00:20:33.433 "superblock": true, 00:20:33.433 "num_base_bdevs": 4, 00:20:33.433 "num_base_bdevs_discovered": 3, 00:20:33.433 "num_base_bdevs_operational": 3, 00:20:33.433 "base_bdevs_list": [ 00:20:33.433 { 00:20:33.433 "name": "spare", 00:20:33.433 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:33.433 "is_configured": true, 00:20:33.433 "data_offset": 2048, 00:20:33.433 "data_size": 63488 00:20:33.433 }, 00:20:33.433 { 00:20:33.433 "name": null, 00:20:33.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.433 "is_configured": false, 00:20:33.433 "data_offset": 2048, 00:20:33.433 "data_size": 63488 00:20:33.433 }, 00:20:33.433 { 00:20:33.433 "name": "BaseBdev3", 00:20:33.433 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:33.433 "is_configured": true, 00:20:33.433 "data_offset": 2048, 00:20:33.433 "data_size": 63488 00:20:33.433 }, 00:20:33.433 { 00:20:33.433 "name": "BaseBdev4", 00:20:33.433 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:33.433 "is_configured": true, 00:20:33.433 "data_offset": 2048, 00:20:33.433 "data_size": 63488 00:20:33.433 } 00:20:33.433 ] 00:20:33.433 }' 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 
00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.433 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:33.690 "name": "raid_bdev1", 00:20:33.690 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:33.690 "strip_size_kb": 0, 00:20:33.690 "state": "online", 00:20:33.690 "raid_level": "raid1", 00:20:33.690 "superblock": true, 00:20:33.690 "num_base_bdevs": 4, 00:20:33.690 "num_base_bdevs_discovered": 3, 00:20:33.690 "num_base_bdevs_operational": 3, 00:20:33.690 "base_bdevs_list": [ 00:20:33.690 { 00:20:33.690 "name": "spare", 00:20:33.690 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:33.690 "is_configured": true, 00:20:33.690 "data_offset": 2048, 00:20:33.690 "data_size": 63488 00:20:33.690 }, 00:20:33.690 { 00:20:33.690 "name": null, 00:20:33.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.690 "is_configured": false, 00:20:33.690 "data_offset": 2048, 00:20:33.690 "data_size": 63488 00:20:33.690 }, 00:20:33.690 { 00:20:33.690 "name": "BaseBdev3", 00:20:33.690 
"uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:33.690 "is_configured": true, 00:20:33.690 "data_offset": 2048, 00:20:33.690 "data_size": 63488 00:20:33.690 }, 00:20:33.690 { 00:20:33.690 "name": "BaseBdev4", 00:20:33.690 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:33.690 "is_configured": true, 00:20:33.690 "data_offset": 2048, 00:20:33.690 "data_size": 63488 00:20:33.690 } 00:20:33.690 ] 00:20:33.690 }' 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:33.690 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.691 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.691 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.691 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:20:33.691 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.691 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.950 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.950 "name": "raid_bdev1", 00:20:33.950 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:33.950 "strip_size_kb": 0, 00:20:33.950 "state": "online", 00:20:33.950 "raid_level": "raid1", 00:20:33.950 "superblock": true, 00:20:33.950 "num_base_bdevs": 4, 00:20:33.950 "num_base_bdevs_discovered": 3, 00:20:33.950 "num_base_bdevs_operational": 3, 00:20:33.950 "base_bdevs_list": [ 00:20:33.950 { 00:20:33.950 "name": "spare", 00:20:33.950 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:33.950 "is_configured": true, 00:20:33.950 "data_offset": 2048, 00:20:33.950 "data_size": 63488 00:20:33.950 }, 00:20:33.950 { 00:20:33.950 "name": null, 00:20:33.950 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.950 "is_configured": false, 00:20:33.950 "data_offset": 2048, 00:20:33.950 "data_size": 63488 00:20:33.950 }, 00:20:33.950 { 00:20:33.950 "name": "BaseBdev3", 00:20:33.950 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:33.950 "is_configured": true, 00:20:33.950 "data_offset": 2048, 00:20:33.950 "data_size": 63488 00:20:33.950 }, 00:20:33.950 { 00:20:33.950 "name": "BaseBdev4", 00:20:33.950 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:33.950 "is_configured": true, 00:20:33.950 "data_offset": 2048, 00:20:33.950 "data_size": 63488 00:20:33.950 } 00:20:33.950 ] 00:20:33.950 }' 00:20:33.950 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.950 23:42:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:34.517 23:42:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:34.517 [2024-07-24 23:42:19.413767] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:34.517 [2024-07-24 23:42:19.413796] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:34.517 00:20:34.517 Latency(us) 00:20:34.517 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:34.517 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:20:34.517 raid_bdev1 : 9.51 120.40 361.19 0.00 0.00 11911.49 245.76 114344.72 00:20:34.517 =================================================================================================================== 00:20:34.517 Total : 120.40 361.19 0.00 0.00 11911.49 245.76 114344.72 00:20:34.517 [2024-07-24 23:42:19.444526] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:34.517 [2024-07-24 23:42:19.444550] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:34.517 [2024-07-24 23:42:19.444613] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:34.517 [2024-07-24 23:42:19.444620] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e2d20 name raid_bdev1, state offline 00:20:34.517 0 00:20:34.517 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.517 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 
00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:34.776 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:20:35.035 /dev/nbd0 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:35.035 23:42:19 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:35.035 1+0 records in 00:20:35.035 1+0 records out 00:20:35.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196265 s, 20.9 MB/s 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:35.035 23:42:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:20:35.035 /dev/nbd1 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:20:35.035 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:35.294 1+0 records in 00:20:35.294 1+0 records out 00:20:35.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221095 s, 18.5 MB/s 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:35.294 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:20:35.553 /dev/nbd1 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:20:35.553 23:42:20 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:35.553 1+0 records in 00:20:35.553 1+0 records out 00:20:35.553 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187339 s, 21.9 MB/s 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.553 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:35.554 23:42:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:35.554 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:35.812 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 
00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:20:36.070 23:42:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:36.329 [2024-07-24 23:42:21.252299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:36.329 [2024-07-24 23:42:21.252332] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.329 [2024-07-24 23:42:21.252344] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e4800 00:20:36.329 [2024-07-24 23:42:21.252365] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.329 [2024-07-24 23:42:21.253545] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.329 [2024-07-24 23:42:21.253567] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:36.329 [2024-07-24 23:42:21.253621] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:36.329 [2024-07-24 23:42:21.253642] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:36.329 [2024-07-24 23:42:21.253722] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:36.329 [2024-07-24 23:42:21.253771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:36.329 spare 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.329 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:36.588 [2024-07-24 23:42:21.354064] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e4130 00:20:36.588 [2024-07-24 23:42:21.354073] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:36.588 [2024-07-24 23:42:21.354203] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x245cf80 00:20:36.588 [2024-07-24 23:42:21.354299] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e4130 00:20:36.588 [2024-07-24 23:42:21.354304] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23e4130 00:20:36.588 [2024-07-24 23:42:21.354370] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:36.588 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.588 "name": "raid_bdev1", 00:20:36.588 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:36.588 "strip_size_kb": 0, 00:20:36.588 "state": "online", 00:20:36.588 "raid_level": "raid1", 00:20:36.588 "superblock": true, 00:20:36.588 "num_base_bdevs": 4, 00:20:36.588 "num_base_bdevs_discovered": 3, 00:20:36.588 "num_base_bdevs_operational": 3, 00:20:36.588 "base_bdevs_list": [ 00:20:36.588 { 00:20:36.588 "name": "spare", 00:20:36.588 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:36.588 "is_configured": true, 00:20:36.588 "data_offset": 2048, 00:20:36.588 "data_size": 63488 00:20:36.588 }, 00:20:36.588 { 00:20:36.588 "name": null, 00:20:36.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.588 "is_configured": false, 00:20:36.588 "data_offset": 2048, 00:20:36.588 "data_size": 63488 00:20:36.588 }, 00:20:36.588 { 00:20:36.588 "name": "BaseBdev3", 00:20:36.588 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:36.588 "is_configured": true, 00:20:36.588 "data_offset": 2048, 00:20:36.588 "data_size": 63488 00:20:36.588 }, 
00:20:36.588 { 00:20:36.588 "name": "BaseBdev4", 00:20:36.588 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:36.588 "is_configured": true, 00:20:36.588 "data_offset": 2048, 00:20:36.588 "data_size": 63488 00:20:36.588 } 00:20:36.588 ] 00:20:36.588 }' 00:20:36.588 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.588 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.156 23:42:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.156 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:37.156 "name": "raid_bdev1", 00:20:37.156 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:37.156 "strip_size_kb": 0, 00:20:37.156 "state": "online", 00:20:37.156 "raid_level": "raid1", 00:20:37.156 "superblock": true, 00:20:37.156 "num_base_bdevs": 4, 00:20:37.156 "num_base_bdevs_discovered": 3, 00:20:37.156 "num_base_bdevs_operational": 3, 00:20:37.156 "base_bdevs_list": [ 00:20:37.156 { 00:20:37.156 "name": "spare", 00:20:37.156 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:37.156 "is_configured": true, 
00:20:37.156 "data_offset": 2048, 00:20:37.156 "data_size": 63488 00:20:37.156 }, 00:20:37.156 { 00:20:37.156 "name": null, 00:20:37.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.156 "is_configured": false, 00:20:37.156 "data_offset": 2048, 00:20:37.156 "data_size": 63488 00:20:37.156 }, 00:20:37.156 { 00:20:37.156 "name": "BaseBdev3", 00:20:37.156 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:37.156 "is_configured": true, 00:20:37.156 "data_offset": 2048, 00:20:37.156 "data_size": 63488 00:20:37.156 }, 00:20:37.156 { 00:20:37.156 "name": "BaseBdev4", 00:20:37.156 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:37.156 "is_configured": true, 00:20:37.156 "data_offset": 2048, 00:20:37.156 "data_size": 63488 00:20:37.156 } 00:20:37.156 ] 00:20:37.156 }' 00:20:37.156 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:37.156 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:37.156 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:37.414 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:37.414 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.414 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:37.414 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:20:37.414 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:37.673 [2024-07-24 23:42:22.483715] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 
00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.673 "name": "raid_bdev1", 00:20:37.673 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:37.673 "strip_size_kb": 0, 00:20:37.673 "state": "online", 00:20:37.673 "raid_level": "raid1", 00:20:37.673 "superblock": true, 00:20:37.673 "num_base_bdevs": 4, 00:20:37.673 "num_base_bdevs_discovered": 2, 00:20:37.673 "num_base_bdevs_operational": 2, 00:20:37.673 "base_bdevs_list": [ 00:20:37.673 { 00:20:37.673 "name": null, 
00:20:37.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.673 "is_configured": false, 00:20:37.673 "data_offset": 2048, 00:20:37.673 "data_size": 63488 00:20:37.673 }, 00:20:37.673 { 00:20:37.673 "name": null, 00:20:37.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.673 "is_configured": false, 00:20:37.673 "data_offset": 2048, 00:20:37.673 "data_size": 63488 00:20:37.673 }, 00:20:37.673 { 00:20:37.673 "name": "BaseBdev3", 00:20:37.673 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:37.673 "is_configured": true, 00:20:37.673 "data_offset": 2048, 00:20:37.673 "data_size": 63488 00:20:37.673 }, 00:20:37.673 { 00:20:37.673 "name": "BaseBdev4", 00:20:37.673 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:37.673 "is_configured": true, 00:20:37.673 "data_offset": 2048, 00:20:37.673 "data_size": 63488 00:20:37.673 } 00:20:37.673 ] 00:20:37.673 }' 00:20:37.673 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.674 23:42:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:38.242 23:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:38.508 [2024-07-24 23:42:23.309930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.508 [2024-07-24 23:42:23.310053] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:38.508 [2024-07-24 23:42:23.310062] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:38.508 [2024-07-24 23:42:23.310082] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.508 [2024-07-24 23:42:23.313992] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23e95d0 00:20:38.508 [2024-07-24 23:42:23.315476] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.508 23:42:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.444 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:39.703 "name": "raid_bdev1", 00:20:39.703 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:39.703 "strip_size_kb": 0, 00:20:39.703 "state": "online", 00:20:39.703 "raid_level": "raid1", 00:20:39.703 "superblock": true, 00:20:39.703 "num_base_bdevs": 4, 00:20:39.703 "num_base_bdevs_discovered": 3, 00:20:39.703 "num_base_bdevs_operational": 3, 00:20:39.703 "process": { 00:20:39.703 "type": "rebuild", 00:20:39.703 "target": "spare", 00:20:39.703 "progress": { 00:20:39.703 "blocks": 22528, 
00:20:39.703 "percent": 35 00:20:39.703 } 00:20:39.703 }, 00:20:39.703 "base_bdevs_list": [ 00:20:39.703 { 00:20:39.703 "name": "spare", 00:20:39.703 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:39.703 "is_configured": true, 00:20:39.703 "data_offset": 2048, 00:20:39.703 "data_size": 63488 00:20:39.703 }, 00:20:39.703 { 00:20:39.703 "name": null, 00:20:39.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.703 "is_configured": false, 00:20:39.703 "data_offset": 2048, 00:20:39.703 "data_size": 63488 00:20:39.703 }, 00:20:39.703 { 00:20:39.703 "name": "BaseBdev3", 00:20:39.703 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:39.703 "is_configured": true, 00:20:39.703 "data_offset": 2048, 00:20:39.703 "data_size": 63488 00:20:39.703 }, 00:20:39.703 { 00:20:39.703 "name": "BaseBdev4", 00:20:39.703 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:39.703 "is_configured": true, 00:20:39.703 "data_offset": 2048, 00:20:39.703 "data_size": 63488 00:20:39.703 } 00:20:39.703 ] 00:20:39.703 }' 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.703 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:39.962 [2024-07-24 23:42:24.742197] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:39.962 [2024-07-24 23:42:24.825989] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:39.962 [2024-07-24 23:42:24.826021] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.962 [2024-07-24 23:42:24.826031] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:39.962 [2024-07-24 23:42:24.826035] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.962 23:42:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.221 23:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.221 "name": "raid_bdev1", 00:20:40.221 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 
00:20:40.221 "strip_size_kb": 0, 00:20:40.221 "state": "online", 00:20:40.221 "raid_level": "raid1", 00:20:40.221 "superblock": true, 00:20:40.221 "num_base_bdevs": 4, 00:20:40.221 "num_base_bdevs_discovered": 2, 00:20:40.221 "num_base_bdevs_operational": 2, 00:20:40.221 "base_bdevs_list": [ 00:20:40.221 { 00:20:40.221 "name": null, 00:20:40.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.221 "is_configured": false, 00:20:40.221 "data_offset": 2048, 00:20:40.221 "data_size": 63488 00:20:40.221 }, 00:20:40.221 { 00:20:40.221 "name": null, 00:20:40.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.221 "is_configured": false, 00:20:40.221 "data_offset": 2048, 00:20:40.221 "data_size": 63488 00:20:40.221 }, 00:20:40.221 { 00:20:40.221 "name": "BaseBdev3", 00:20:40.221 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:40.221 "is_configured": true, 00:20:40.221 "data_offset": 2048, 00:20:40.221 "data_size": 63488 00:20:40.221 }, 00:20:40.221 { 00:20:40.221 "name": "BaseBdev4", 00:20:40.221 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:40.221 "is_configured": true, 00:20:40.221 "data_offset": 2048, 00:20:40.221 "data_size": 63488 00:20:40.221 } 00:20:40.221 ] 00:20:40.221 }' 00:20:40.221 23:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.221 23:42:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:40.788 23:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:40.788 [2024-07-24 23:42:25.644040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:40.788 [2024-07-24 23:42:25.644081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.788 [2024-07-24 23:42:25.644112] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x2347a60 00:20:40.788 [2024-07-24 23:42:25.644124] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.788 [2024-07-24 23:42:25.644415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.788 [2024-07-24 23:42:25.644427] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:40.788 [2024-07-24 23:42:25.644496] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:40.788 [2024-07-24 23:42:25.644504] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:40.788 [2024-07-24 23:42:25.644509] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:40.788 [2024-07-24 23:42:25.644521] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:40.788 [2024-07-24 23:42:25.648406] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb8b50 00:20:40.788 spare 00:20:40.788 [2024-07-24 23:42:25.649465] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:40.788 23:42:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.723 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:41.982 "name": "raid_bdev1", 00:20:41.982 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:41.982 "strip_size_kb": 0, 00:20:41.982 "state": "online", 00:20:41.982 "raid_level": "raid1", 00:20:41.982 "superblock": true, 00:20:41.982 "num_base_bdevs": 4, 00:20:41.982 "num_base_bdevs_discovered": 3, 00:20:41.982 "num_base_bdevs_operational": 3, 00:20:41.982 "process": { 00:20:41.982 "type": "rebuild", 00:20:41.982 "target": "spare", 00:20:41.982 "progress": { 00:20:41.982 "blocks": 22528, 00:20:41.982 "percent": 35 00:20:41.982 } 00:20:41.982 }, 00:20:41.982 "base_bdevs_list": [ 00:20:41.982 { 00:20:41.982 "name": "spare", 00:20:41.982 "uuid": "2ca84fd9-98d7-508f-8bbe-60f44c80847b", 00:20:41.982 "is_configured": true, 00:20:41.982 "data_offset": 2048, 00:20:41.982 "data_size": 63488 00:20:41.982 }, 00:20:41.982 { 00:20:41.982 "name": null, 00:20:41.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.982 "is_configured": false, 00:20:41.982 "data_offset": 2048, 00:20:41.982 "data_size": 63488 00:20:41.982 }, 00:20:41.982 { 00:20:41.982 "name": "BaseBdev3", 00:20:41.982 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:41.982 "is_configured": true, 00:20:41.982 "data_offset": 2048, 00:20:41.982 "data_size": 63488 00:20:41.982 }, 00:20:41.982 { 00:20:41.982 "name": "BaseBdev4", 00:20:41.982 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:41.982 "is_configured": true, 00:20:41.982 "data_offset": 2048, 00:20:41.982 "data_size": 63488 00:20:41.982 } 00:20:41.982 ] 00:20:41.982 }' 00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:20:41.982 23:42:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:42.241 [2024-07-24 23:42:27.076614] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:42.241 [2024-07-24 23:42:27.160119] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:42.241 [2024-07-24 23:42:27.160155] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.241 [2024-07-24 23:42:27.160170] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:42.241 [2024-07-24 23:42:27.160175] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.241 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.500 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.500 "name": "raid_bdev1", 00:20:42.500 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:42.500 "strip_size_kb": 0, 00:20:42.500 "state": "online", 00:20:42.500 "raid_level": "raid1", 00:20:42.500 "superblock": true, 00:20:42.500 "num_base_bdevs": 4, 00:20:42.500 "num_base_bdevs_discovered": 2, 00:20:42.500 "num_base_bdevs_operational": 2, 00:20:42.500 "base_bdevs_list": [ 00:20:42.500 { 00:20:42.500 "name": null, 00:20:42.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.500 "is_configured": false, 00:20:42.500 "data_offset": 2048, 00:20:42.500 "data_size": 63488 00:20:42.500 }, 00:20:42.500 { 00:20:42.500 "name": null, 00:20:42.500 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:42.500 "is_configured": false, 00:20:42.500 "data_offset": 2048, 00:20:42.500 "data_size": 63488 00:20:42.500 }, 00:20:42.500 { 00:20:42.500 "name": "BaseBdev3", 00:20:42.500 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:42.500 "is_configured": true, 00:20:42.500 "data_offset": 2048, 00:20:42.500 "data_size": 63488 00:20:42.500 }, 00:20:42.500 { 00:20:42.500 "name": "BaseBdev4", 00:20:42.500 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:42.500 "is_configured": true, 00:20:42.500 "data_offset": 2048, 
00:20:42.500 "data_size": 63488 00:20:42.500 } 00:20:42.500 ] 00:20:42.500 }' 00:20:42.500 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.500 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.068 23:42:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.068 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:43.068 "name": "raid_bdev1", 00:20:43.068 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:43.068 "strip_size_kb": 0, 00:20:43.068 "state": "online", 00:20:43.068 "raid_level": "raid1", 00:20:43.068 "superblock": true, 00:20:43.068 "num_base_bdevs": 4, 00:20:43.068 "num_base_bdevs_discovered": 2, 00:20:43.068 "num_base_bdevs_operational": 2, 00:20:43.068 "base_bdevs_list": [ 00:20:43.068 { 00:20:43.068 "name": null, 00:20:43.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.068 "is_configured": false, 00:20:43.068 "data_offset": 2048, 00:20:43.068 "data_size": 63488 00:20:43.068 }, 00:20:43.068 { 00:20:43.068 "name": null, 00:20:43.068 "uuid": "00000000-0000-0000-0000-000000000000", 
00:20:43.068 "is_configured": false, 00:20:43.068 "data_offset": 2048, 00:20:43.068 "data_size": 63488 00:20:43.068 }, 00:20:43.068 { 00:20:43.068 "name": "BaseBdev3", 00:20:43.068 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:43.068 "is_configured": true, 00:20:43.068 "data_offset": 2048, 00:20:43.068 "data_size": 63488 00:20:43.068 }, 00:20:43.068 { 00:20:43.068 "name": "BaseBdev4", 00:20:43.068 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:43.068 "is_configured": true, 00:20:43.068 "data_offset": 2048, 00:20:43.068 "data_size": 63488 00:20:43.068 } 00:20:43.068 ] 00:20:43.068 }' 00:20:43.068 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:43.068 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:43.068 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:43.327 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:43.327 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:43.327 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:43.586 [2024-07-24 23:42:28.407024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:43.586 [2024-07-24 23:42:28.407061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:43.586 [2024-07-24 23:42:28.407084] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b23f0 00:20:43.586 [2024-07-24 23:42:28.407106] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:43.586 
[2024-07-24 23:42:28.407366] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:43.586 [2024-07-24 23:42:28.407377] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:43.586 [2024-07-24 23:42:28.407421] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:43.586 [2024-07-24 23:42:28.407430] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:43.586 [2024-07-24 23:42:28.407435] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:43.586 BaseBdev1 00:20:43.586 23:42:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.521 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.779 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.779 "name": "raid_bdev1", 00:20:44.779 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:44.779 "strip_size_kb": 0, 00:20:44.779 "state": "online", 00:20:44.779 "raid_level": "raid1", 00:20:44.779 "superblock": true, 00:20:44.779 "num_base_bdevs": 4, 00:20:44.779 "num_base_bdevs_discovered": 2, 00:20:44.779 "num_base_bdevs_operational": 2, 00:20:44.779 "base_bdevs_list": [ 00:20:44.779 { 00:20:44.779 "name": null, 00:20:44.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.779 "is_configured": false, 00:20:44.779 "data_offset": 2048, 00:20:44.779 "data_size": 63488 00:20:44.779 }, 00:20:44.779 { 00:20:44.779 "name": null, 00:20:44.779 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.779 "is_configured": false, 00:20:44.779 "data_offset": 2048, 00:20:44.779 "data_size": 63488 00:20:44.779 }, 00:20:44.779 { 00:20:44.779 "name": "BaseBdev3", 00:20:44.779 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:44.779 "is_configured": true, 00:20:44.779 "data_offset": 2048, 00:20:44.779 "data_size": 63488 00:20:44.779 }, 00:20:44.779 { 00:20:44.779 "name": "BaseBdev4", 00:20:44.779 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:44.779 "is_configured": true, 00:20:44.779 "data_offset": 2048, 00:20:44.779 "data_size": 63488 00:20:44.779 } 00:20:44.779 ] 00:20:44.779 }' 00:20:44.779 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.779 23:42:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.345 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:45.345 "name": "raid_bdev1", 00:20:45.345 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:45.345 "strip_size_kb": 0, 00:20:45.345 "state": "online", 00:20:45.345 "raid_level": "raid1", 00:20:45.345 "superblock": true, 00:20:45.345 "num_base_bdevs": 4, 00:20:45.345 "num_base_bdevs_discovered": 2, 00:20:45.345 "num_base_bdevs_operational": 2, 00:20:45.345 "base_bdevs_list": [ 00:20:45.345 { 00:20:45.345 "name": null, 00:20:45.345 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.345 "is_configured": false, 00:20:45.345 "data_offset": 2048, 00:20:45.345 "data_size": 63488 00:20:45.345 }, 00:20:45.345 { 00:20:45.345 "name": null, 00:20:45.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.346 "is_configured": false, 00:20:45.346 "data_offset": 2048, 00:20:45.346 "data_size": 63488 00:20:45.346 }, 00:20:45.346 { 00:20:45.346 "name": "BaseBdev3", 00:20:45.346 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:45.346 "is_configured": true, 00:20:45.346 "data_offset": 2048, 00:20:45.346 "data_size": 63488 00:20:45.346 }, 00:20:45.346 { 
00:20:45.346 "name": "BaseBdev4", 00:20:45.346 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:45.346 "is_configured": true, 00:20:45.346 "data_offset": 2048, 00:20:45.346 "data_size": 63488 00:20:45.346 } 00:20:45.346 ] 00:20:45.346 }' 00:20:45.346 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:45.346 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:45.346 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:45.681 [2024-07-24 23:42:30.504734] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:45.681 [2024-07-24 23:42:30.504838] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:45.681 [2024-07-24 23:42:30.504847] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:45.681 request: 00:20:45.681 { 00:20:45.681 "base_bdev": "BaseBdev1", 00:20:45.681 "raid_bdev": "raid_bdev1", 00:20:45.681 "method": "bdev_raid_add_base_bdev", 00:20:45.681 "req_id": 1 00:20:45.681 } 00:20:45.681 Got JSON-RPC error response 00:20:45.681 response: 00:20:45.681 { 00:20:45.681 "code": -22, 00:20:45.681 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:45.681 } 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:45.681 23:42:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.615 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.873 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.873 "name": "raid_bdev1", 00:20:46.873 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:46.873 "strip_size_kb": 0, 00:20:46.873 "state": "online", 00:20:46.873 "raid_level": "raid1", 00:20:46.873 "superblock": true, 00:20:46.873 "num_base_bdevs": 4, 00:20:46.873 
"num_base_bdevs_discovered": 2, 00:20:46.873 "num_base_bdevs_operational": 2, 00:20:46.873 "base_bdevs_list": [ 00:20:46.873 { 00:20:46.873 "name": null, 00:20:46.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.873 "is_configured": false, 00:20:46.873 "data_offset": 2048, 00:20:46.873 "data_size": 63488 00:20:46.873 }, 00:20:46.873 { 00:20:46.873 "name": null, 00:20:46.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.873 "is_configured": false, 00:20:46.873 "data_offset": 2048, 00:20:46.873 "data_size": 63488 00:20:46.873 }, 00:20:46.873 { 00:20:46.873 "name": "BaseBdev3", 00:20:46.873 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:46.873 "is_configured": true, 00:20:46.873 "data_offset": 2048, 00:20:46.873 "data_size": 63488 00:20:46.873 }, 00:20:46.874 { 00:20:46.874 "name": "BaseBdev4", 00:20:46.874 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:46.874 "is_configured": true, 00:20:46.874 "data_offset": 2048, 00:20:46.874 "data_size": 63488 00:20:46.874 } 00:20:46.874 ] 00:20:46.874 }' 00:20:46.874 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.874 23:42:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:20:47.440 "name": "raid_bdev1", 00:20:47.440 "uuid": "01f2c44b-72f3-43de-9cc9-e8068d169051", 00:20:47.440 "strip_size_kb": 0, 00:20:47.440 "state": "online", 00:20:47.440 "raid_level": "raid1", 00:20:47.440 "superblock": true, 00:20:47.440 "num_base_bdevs": 4, 00:20:47.440 "num_base_bdevs_discovered": 2, 00:20:47.440 "num_base_bdevs_operational": 2, 00:20:47.440 "base_bdevs_list": [ 00:20:47.440 { 00:20:47.440 "name": null, 00:20:47.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.440 "is_configured": false, 00:20:47.440 "data_offset": 2048, 00:20:47.440 "data_size": 63488 00:20:47.440 }, 00:20:47.440 { 00:20:47.440 "name": null, 00:20:47.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.440 "is_configured": false, 00:20:47.440 "data_offset": 2048, 00:20:47.440 "data_size": 63488 00:20:47.440 }, 00:20:47.440 { 00:20:47.440 "name": "BaseBdev3", 00:20:47.440 "uuid": "3634e7dd-b5a1-5e1a-a492-184d583279d8", 00:20:47.440 "is_configured": true, 00:20:47.440 "data_offset": 2048, 00:20:47.440 "data_size": 63488 00:20:47.440 }, 00:20:47.440 { 00:20:47.440 "name": "BaseBdev4", 00:20:47.440 "uuid": "a84d511c-d1cd-5aea-b59e-9797a7396a6d", 00:20:47.440 "is_configured": true, 00:20:47.440 "data_offset": 2048, 00:20:47.440 "data_size": 63488 00:20:47.440 } 00:20:47.440 ] 00:20:47.440 }' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 373154 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 373154 ']' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 373154 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:47.440 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 373154 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 373154' 00:20:47.699 killing process with pid 373154 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 373154 00:20:47.699 Received shutdown signal, test time was about 22.506893 seconds 00:20:47.699 00:20:47.699 Latency(us) 00:20:47.699 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:47.699 =================================================================================================================== 00:20:47.699 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:20:47.699 [2024-07-24 23:42:32.470168] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:47.699 [2024-07-24 23:42:32.470246] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:47.699 [2024-07-24 23:42:32.470293] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:20:47.699 [2024-07-24 23:42:32.470300] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e4130 name raid_bdev1, state offline 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 373154 00:20:47.699 [2024-07-24 23:42:32.505072] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:20:47.699 00:20:47.699 real 0m26.841s 00:20:47.699 user 0m41.609s 00:20:47.699 sys 0m3.402s 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:47.699 23:42:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:20:47.699 ************************************ 00:20:47.699 END TEST raid_rebuild_test_sb_io 00:20:47.699 ************************************ 00:20:47.956 23:42:32 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:20:47.956 23:42:32 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:20:47.956 23:42:32 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:20:47.956 23:42:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:47.956 23:42:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:47.956 23:42:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:47.956 ************************************ 00:20:47.956 START TEST raid_state_function_test_sb_4k 00:20:47.956 ************************************ 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:20:47.956 
23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=378494 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 378494' 00:20:47.956 Process raid pid: 378494 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 378494 /var/tmp/spdk-raid.sock 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 378494 ']' 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:47.956 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:47.956 23:42:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:47.956 [2024-07-24 23:42:32.817250] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:20:47.956 [2024-07-24 23:42:32.817290] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:47.956 [2024-07-24 23:42:32.880747] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:48.214 [2024-07-24 23:42:32.957921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:48.214 [2024-07-24 23:42:33.009853] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:48.214 [2024-07-24 23:42:33.009877] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:20:48.780 [2024-07-24 23:42:33.748366] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:48.780 [2024-07-24 23:42:33.748396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:48.780 [2024-07-24 23:42:33.748402] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:48.780 [2024-07-24 23:42:33.748407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.780 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.038 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.038 "name": "Existed_Raid", 00:20:49.038 "uuid": "5ac34ed4-59c5-48ac-8230-ce3b3499cf32", 00:20:49.038 "strip_size_kb": 0, 00:20:49.038 "state": "configuring", 00:20:49.038 "raid_level": "raid1", 00:20:49.038 "superblock": true, 00:20:49.038 "num_base_bdevs": 2, 00:20:49.038 "num_base_bdevs_discovered": 0, 00:20:49.038 "num_base_bdevs_operational": 2, 00:20:49.038 "base_bdevs_list": [ 00:20:49.038 { 00:20:49.038 "name": "BaseBdev1", 00:20:49.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.038 "is_configured": false, 00:20:49.038 "data_offset": 0, 00:20:49.038 "data_size": 0 
00:20:49.038 }, 00:20:49.038 { 00:20:49.038 "name": "BaseBdev2", 00:20:49.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.038 "is_configured": false, 00:20:49.038 "data_offset": 0, 00:20:49.038 "data_size": 0 00:20:49.038 } 00:20:49.038 ] 00:20:49.038 }' 00:20:49.038 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.038 23:42:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:49.603 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:49.603 [2024-07-24 23:42:34.554367] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:49.603 [2024-07-24 23:42:34.554388] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a2b10 name Existed_Raid, state configuring 00:20:49.603 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:20:49.861 [2024-07-24 23:42:34.722815] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:49.861 [2024-07-24 23:42:34.722833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:49.861 [2024-07-24 23:42:34.722838] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:49.861 [2024-07-24 23:42:34.722843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:49.861 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:20:50.120 [2024-07-24 
23:42:34.891288] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:50.120 BaseBdev1 00:20:50.120 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:50.120 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:50.120 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:50.120 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:20:50.121 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:50.121 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:50.121 23:42:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:50.121 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:50.379 [ 00:20:50.379 { 00:20:50.379 "name": "BaseBdev1", 00:20:50.379 "aliases": [ 00:20:50.379 "0e03fb2b-fe26-48d4-afad-91a602eee7d0" 00:20:50.379 ], 00:20:50.379 "product_name": "Malloc disk", 00:20:50.379 "block_size": 4096, 00:20:50.379 "num_blocks": 8192, 00:20:50.379 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:50.379 "assigned_rate_limits": { 00:20:50.379 "rw_ios_per_sec": 0, 00:20:50.379 "rw_mbytes_per_sec": 0, 00:20:50.379 "r_mbytes_per_sec": 0, 00:20:50.379 "w_mbytes_per_sec": 0 00:20:50.379 }, 00:20:50.379 "claimed": true, 00:20:50.379 "claim_type": "exclusive_write", 00:20:50.379 "zoned": false, 00:20:50.379 "supported_io_types": { 00:20:50.379 "read": true, 00:20:50.379 "write": true, 
00:20:50.379 "unmap": true, 00:20:50.379 "flush": true, 00:20:50.379 "reset": true, 00:20:50.379 "nvme_admin": false, 00:20:50.379 "nvme_io": false, 00:20:50.379 "nvme_io_md": false, 00:20:50.379 "write_zeroes": true, 00:20:50.379 "zcopy": true, 00:20:50.379 "get_zone_info": false, 00:20:50.379 "zone_management": false, 00:20:50.379 "zone_append": false, 00:20:50.379 "compare": false, 00:20:50.379 "compare_and_write": false, 00:20:50.379 "abort": true, 00:20:50.379 "seek_hole": false, 00:20:50.379 "seek_data": false, 00:20:50.379 "copy": true, 00:20:50.379 "nvme_iov_md": false 00:20:50.379 }, 00:20:50.379 "memory_domains": [ 00:20:50.379 { 00:20:50.379 "dma_device_id": "system", 00:20:50.379 "dma_device_type": 1 00:20:50.379 }, 00:20:50.379 { 00:20:50.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.379 "dma_device_type": 2 00:20:50.379 } 00:20:50.379 ], 00:20:50.379 "driver_specific": {} 00:20:50.379 } 00:20:50.379 ] 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.379 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.638 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.638 "name": "Existed_Raid", 00:20:50.638 "uuid": "839d9db2-a199-4728-9714-242804fd193b", 00:20:50.638 "strip_size_kb": 0, 00:20:50.638 "state": "configuring", 00:20:50.638 "raid_level": "raid1", 00:20:50.638 "superblock": true, 00:20:50.638 "num_base_bdevs": 2, 00:20:50.638 "num_base_bdevs_discovered": 1, 00:20:50.638 "num_base_bdevs_operational": 2, 00:20:50.638 "base_bdevs_list": [ 00:20:50.638 { 00:20:50.638 "name": "BaseBdev1", 00:20:50.638 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:50.638 "is_configured": true, 00:20:50.638 "data_offset": 256, 00:20:50.638 "data_size": 7936 00:20:50.638 }, 00:20:50.638 { 00:20:50.638 "name": "BaseBdev2", 00:20:50.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.638 "is_configured": false, 00:20:50.638 "data_offset": 0, 00:20:50.638 "data_size": 0 00:20:50.638 } 00:20:50.638 ] 00:20:50.638 }' 00:20:50.638 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.638 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:51.204 23:42:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:51.204 [2024-07-24 23:42:36.074456] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:51.204 [2024-07-24 23:42:36.074490] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a23a0 name Existed_Raid, state configuring 00:20:51.204 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:20:51.463 [2024-07-24 23:42:36.238901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:51.463 [2024-07-24 23:42:36.239869] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:51.463 [2024-07-24 23:42:36.239900] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:51.463 23:42:36 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.463 "name": "Existed_Raid", 00:20:51.463 "uuid": "83d26241-4e84-4a54-bed7-8a11464dece2", 00:20:51.463 "strip_size_kb": 0, 00:20:51.463 "state": "configuring", 00:20:51.463 "raid_level": "raid1", 00:20:51.463 "superblock": true, 00:20:51.463 "num_base_bdevs": 2, 00:20:51.463 "num_base_bdevs_discovered": 1, 00:20:51.463 "num_base_bdevs_operational": 2, 00:20:51.463 "base_bdevs_list": [ 00:20:51.463 { 00:20:51.463 "name": "BaseBdev1", 00:20:51.463 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:51.463 "is_configured": true, 00:20:51.463 "data_offset": 256, 00:20:51.463 "data_size": 7936 00:20:51.463 }, 00:20:51.463 { 00:20:51.463 "name": "BaseBdev2", 00:20:51.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.463 "is_configured": false, 00:20:51.463 "data_offset": 0, 00:20:51.463 "data_size": 0 00:20:51.463 } 00:20:51.463 ] 00:20:51.463 }' 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.463 23:42:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:52.030 
23:42:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:20:52.289 [2024-07-24 23:42:37.079682] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:52.289 [2024-07-24 23:42:37.079798] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14a3050 00:20:52.289 [2024-07-24 23:42:37.079808] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:20:52.289 [2024-07-24 23:42:37.079925] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a6eb0 00:20:52.289 [2024-07-24 23:42:37.080011] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14a3050 00:20:52.289 [2024-07-24 23:42:37.080019] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14a3050 00:20:52.289 [2024-07-24 23:42:37.080080] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:52.289 BaseBdev2 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.289 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:52.548 [ 00:20:52.548 { 00:20:52.548 "name": "BaseBdev2", 00:20:52.548 "aliases": [ 00:20:52.548 "068fcdad-5e71-40d9-8f44-c50aee6cd527" 00:20:52.548 ], 00:20:52.548 "product_name": "Malloc disk", 00:20:52.548 "block_size": 4096, 00:20:52.548 "num_blocks": 8192, 00:20:52.548 "uuid": "068fcdad-5e71-40d9-8f44-c50aee6cd527", 00:20:52.548 "assigned_rate_limits": { 00:20:52.548 "rw_ios_per_sec": 0, 00:20:52.548 "rw_mbytes_per_sec": 0, 00:20:52.548 "r_mbytes_per_sec": 0, 00:20:52.548 "w_mbytes_per_sec": 0 00:20:52.548 }, 00:20:52.548 "claimed": true, 00:20:52.548 "claim_type": "exclusive_write", 00:20:52.548 "zoned": false, 00:20:52.548 "supported_io_types": { 00:20:52.548 "read": true, 00:20:52.548 "write": true, 00:20:52.548 "unmap": true, 00:20:52.548 "flush": true, 00:20:52.548 "reset": true, 00:20:52.548 "nvme_admin": false, 00:20:52.548 "nvme_io": false, 00:20:52.548 "nvme_io_md": false, 00:20:52.548 "write_zeroes": true, 00:20:52.548 "zcopy": true, 00:20:52.548 "get_zone_info": false, 00:20:52.548 "zone_management": false, 00:20:52.548 "zone_append": false, 00:20:52.548 "compare": false, 00:20:52.548 "compare_and_write": false, 00:20:52.548 "abort": true, 00:20:52.548 "seek_hole": false, 00:20:52.548 "seek_data": false, 00:20:52.548 "copy": true, 00:20:52.548 "nvme_iov_md": false 00:20:52.548 }, 00:20:52.548 "memory_domains": [ 00:20:52.548 { 00:20:52.548 "dma_device_id": "system", 00:20:52.548 "dma_device_type": 1 00:20:52.548 }, 00:20:52.548 { 00:20:52.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.548 "dma_device_type": 2 00:20:52.548 } 00:20:52.548 ], 00:20:52.548 "driver_specific": {} 00:20:52.548 } 00:20:52.548 ] 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@907 -- # return 0 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.548 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.807 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.807 "name": "Existed_Raid", 00:20:52.807 "uuid": 
"83d26241-4e84-4a54-bed7-8a11464dece2", 00:20:52.807 "strip_size_kb": 0, 00:20:52.807 "state": "online", 00:20:52.807 "raid_level": "raid1", 00:20:52.807 "superblock": true, 00:20:52.807 "num_base_bdevs": 2, 00:20:52.807 "num_base_bdevs_discovered": 2, 00:20:52.807 "num_base_bdevs_operational": 2, 00:20:52.807 "base_bdevs_list": [ 00:20:52.807 { 00:20:52.807 "name": "BaseBdev1", 00:20:52.807 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:52.807 "is_configured": true, 00:20:52.807 "data_offset": 256, 00:20:52.807 "data_size": 7936 00:20:52.807 }, 00:20:52.807 { 00:20:52.807 "name": "BaseBdev2", 00:20:52.807 "uuid": "068fcdad-5e71-40d9-8f44-c50aee6cd527", 00:20:52.807 "is_configured": true, 00:20:52.807 "data_offset": 256, 00:20:52.807 "data_size": 7936 00:20:52.807 } 00:20:52.807 ] 00:20:52.807 }' 00:20:52.807 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.807 23:42:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:53.375 23:42:38 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:53.375 [2024-07-24 23:42:38.226919] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:53.375 "name": "Existed_Raid", 00:20:53.375 "aliases": [ 00:20:53.375 "83d26241-4e84-4a54-bed7-8a11464dece2" 00:20:53.375 ], 00:20:53.375 "product_name": "Raid Volume", 00:20:53.375 "block_size": 4096, 00:20:53.375 "num_blocks": 7936, 00:20:53.375 "uuid": "83d26241-4e84-4a54-bed7-8a11464dece2", 00:20:53.375 "assigned_rate_limits": { 00:20:53.375 "rw_ios_per_sec": 0, 00:20:53.375 "rw_mbytes_per_sec": 0, 00:20:53.375 "r_mbytes_per_sec": 0, 00:20:53.375 "w_mbytes_per_sec": 0 00:20:53.375 }, 00:20:53.375 "claimed": false, 00:20:53.375 "zoned": false, 00:20:53.375 "supported_io_types": { 00:20:53.375 "read": true, 00:20:53.375 "write": true, 00:20:53.375 "unmap": false, 00:20:53.375 "flush": false, 00:20:53.375 "reset": true, 00:20:53.375 "nvme_admin": false, 00:20:53.375 "nvme_io": false, 00:20:53.375 "nvme_io_md": false, 00:20:53.375 "write_zeroes": true, 00:20:53.375 "zcopy": false, 00:20:53.375 "get_zone_info": false, 00:20:53.375 "zone_management": false, 00:20:53.375 "zone_append": false, 00:20:53.375 "compare": false, 00:20:53.375 "compare_and_write": false, 00:20:53.375 "abort": false, 00:20:53.375 "seek_hole": false, 00:20:53.375 "seek_data": false, 00:20:53.375 "copy": false, 00:20:53.375 "nvme_iov_md": false 00:20:53.375 }, 00:20:53.375 "memory_domains": [ 00:20:53.375 { 00:20:53.375 "dma_device_id": "system", 00:20:53.375 "dma_device_type": 1 00:20:53.375 }, 00:20:53.375 { 00:20:53.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.375 "dma_device_type": 2 00:20:53.375 }, 00:20:53.375 { 00:20:53.375 "dma_device_id": "system", 00:20:53.375 "dma_device_type": 1 00:20:53.375 }, 00:20:53.375 { 00:20:53.375 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:53.375 "dma_device_type": 2 00:20:53.375 } 00:20:53.375 ], 00:20:53.375 "driver_specific": { 00:20:53.375 "raid": { 00:20:53.375 "uuid": "83d26241-4e84-4a54-bed7-8a11464dece2", 00:20:53.375 "strip_size_kb": 0, 00:20:53.375 "state": "online", 00:20:53.375 "raid_level": "raid1", 00:20:53.375 "superblock": true, 00:20:53.375 "num_base_bdevs": 2, 00:20:53.375 "num_base_bdevs_discovered": 2, 00:20:53.375 "num_base_bdevs_operational": 2, 00:20:53.375 "base_bdevs_list": [ 00:20:53.375 { 00:20:53.375 "name": "BaseBdev1", 00:20:53.375 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:53.375 "is_configured": true, 00:20:53.375 "data_offset": 256, 00:20:53.375 "data_size": 7936 00:20:53.375 }, 00:20:53.375 { 00:20:53.375 "name": "BaseBdev2", 00:20:53.375 "uuid": "068fcdad-5e71-40d9-8f44-c50aee6cd527", 00:20:53.375 "is_configured": true, 00:20:53.375 "data_offset": 256, 00:20:53.375 "data_size": 7936 00:20:53.375 } 00:20:53.375 ] 00:20:53.375 } 00:20:53.375 } 00:20:53.375 }' 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:53.375 BaseBdev2' 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:53.375 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:53.634 "name": "BaseBdev1", 00:20:53.634 "aliases": [ 00:20:53.634 "0e03fb2b-fe26-48d4-afad-91a602eee7d0" 
00:20:53.634 ], 00:20:53.634 "product_name": "Malloc disk", 00:20:53.634 "block_size": 4096, 00:20:53.634 "num_blocks": 8192, 00:20:53.634 "uuid": "0e03fb2b-fe26-48d4-afad-91a602eee7d0", 00:20:53.634 "assigned_rate_limits": { 00:20:53.634 "rw_ios_per_sec": 0, 00:20:53.634 "rw_mbytes_per_sec": 0, 00:20:53.634 "r_mbytes_per_sec": 0, 00:20:53.634 "w_mbytes_per_sec": 0 00:20:53.634 }, 00:20:53.634 "claimed": true, 00:20:53.634 "claim_type": "exclusive_write", 00:20:53.634 "zoned": false, 00:20:53.634 "supported_io_types": { 00:20:53.634 "read": true, 00:20:53.634 "write": true, 00:20:53.634 "unmap": true, 00:20:53.634 "flush": true, 00:20:53.634 "reset": true, 00:20:53.634 "nvme_admin": false, 00:20:53.634 "nvme_io": false, 00:20:53.634 "nvme_io_md": false, 00:20:53.634 "write_zeroes": true, 00:20:53.634 "zcopy": true, 00:20:53.634 "get_zone_info": false, 00:20:53.634 "zone_management": false, 00:20:53.634 "zone_append": false, 00:20:53.634 "compare": false, 00:20:53.634 "compare_and_write": false, 00:20:53.634 "abort": true, 00:20:53.634 "seek_hole": false, 00:20:53.634 "seek_data": false, 00:20:53.634 "copy": true, 00:20:53.634 "nvme_iov_md": false 00:20:53.634 }, 00:20:53.634 "memory_domains": [ 00:20:53.634 { 00:20:53.634 "dma_device_id": "system", 00:20:53.634 "dma_device_type": 1 00:20:53.634 }, 00:20:53.634 { 00:20:53.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.634 "dma_device_type": 2 00:20:53.634 } 00:20:53.634 ], 00:20:53.634 "driver_specific": {} 00:20:53.634 }' 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.634 23:42:38 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:53.634 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:53.893 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.150 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.150 "name": "BaseBdev2", 00:20:54.150 "aliases": [ 00:20:54.150 "068fcdad-5e71-40d9-8f44-c50aee6cd527" 00:20:54.150 ], 00:20:54.151 "product_name": "Malloc disk", 00:20:54.151 "block_size": 4096, 00:20:54.151 "num_blocks": 8192, 00:20:54.151 "uuid": "068fcdad-5e71-40d9-8f44-c50aee6cd527", 00:20:54.151 "assigned_rate_limits": { 00:20:54.151 "rw_ios_per_sec": 0, 00:20:54.151 "rw_mbytes_per_sec": 0, 00:20:54.151 "r_mbytes_per_sec": 0, 00:20:54.151 "w_mbytes_per_sec": 0 00:20:54.151 }, 00:20:54.151 "claimed": true, 00:20:54.151 "claim_type": "exclusive_write", 00:20:54.151 "zoned": false, 
00:20:54.151 "supported_io_types": { 00:20:54.151 "read": true, 00:20:54.151 "write": true, 00:20:54.151 "unmap": true, 00:20:54.151 "flush": true, 00:20:54.151 "reset": true, 00:20:54.151 "nvme_admin": false, 00:20:54.151 "nvme_io": false, 00:20:54.151 "nvme_io_md": false, 00:20:54.151 "write_zeroes": true, 00:20:54.151 "zcopy": true, 00:20:54.151 "get_zone_info": false, 00:20:54.151 "zone_management": false, 00:20:54.151 "zone_append": false, 00:20:54.151 "compare": false, 00:20:54.151 "compare_and_write": false, 00:20:54.151 "abort": true, 00:20:54.151 "seek_hole": false, 00:20:54.151 "seek_data": false, 00:20:54.151 "copy": true, 00:20:54.151 "nvme_iov_md": false 00:20:54.151 }, 00:20:54.151 "memory_domains": [ 00:20:54.151 { 00:20:54.151 "dma_device_id": "system", 00:20:54.151 "dma_device_type": 1 00:20:54.151 }, 00:20:54.151 { 00:20:54.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.151 "dma_device_type": 2 00:20:54.151 } 00:20:54.151 ], 00:20:54.151 "driver_specific": {} 00:20:54.151 }' 00:20:54.151 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.151 23:42:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.151 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.409 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:20:54.409 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.409 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.409 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.409 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:54.667 [2024-07-24 23:42:39.413843] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:54.667 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:20:54.667 23:42:39 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.668 "name": "Existed_Raid", 00:20:54.668 "uuid": "83d26241-4e84-4a54-bed7-8a11464dece2", 00:20:54.668 "strip_size_kb": 0, 00:20:54.668 "state": "online", 00:20:54.668 "raid_level": "raid1", 00:20:54.668 "superblock": true, 00:20:54.668 "num_base_bdevs": 2, 00:20:54.668 "num_base_bdevs_discovered": 1, 00:20:54.668 "num_base_bdevs_operational": 1, 00:20:54.668 "base_bdevs_list": [ 00:20:54.668 { 00:20:54.668 "name": null, 00:20:54.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.668 "is_configured": false, 00:20:54.668 "data_offset": 256, 00:20:54.668 "data_size": 7936 00:20:54.668 }, 00:20:54.668 { 00:20:54.668 "name": "BaseBdev2", 00:20:54.668 "uuid": "068fcdad-5e71-40d9-8f44-c50aee6cd527", 00:20:54.668 "is_configured": true, 00:20:54.668 "data_offset": 256, 00:20:54.668 "data_size": 7936 00:20:54.668 } 00:20:54.668 ] 00:20:54.668 }' 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.668 23:42:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:55.261 23:42:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:55.261 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:55.519 [2024-07-24 23:42:40.365216] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:55.519 [2024-07-24 23:42:40.365282] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:55.519 [2024-07-24 23:42:40.375409] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:55.519 [2024-07-24 23:42:40.375434] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:55.519 [2024-07-24 23:42:40.375440] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a3050 name Existed_Raid, state offline 00:20:55.519 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:55.519 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:55.519 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.519 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 378494 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 378494 ']' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 378494 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 378494 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 378494' 00:20:55.778 killing process with pid 378494 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@969 -- # kill 378494 00:20:55.778 [2024-07-24 23:42:40.602699] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@974 -- # wait 378494 00:20:55.778 [2024-07-24 23:42:40.603475] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:20:55.778 00:20:55.778 real 0m8.013s 00:20:55.778 user 0m14.351s 00:20:55.778 sys 0m1.312s 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:55.778 23:42:40 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:20:55.778 ************************************ 00:20:55.778 END TEST raid_state_function_test_sb_4k 00:20:55.778 ************************************ 00:20:56.037 23:42:40 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:20:56.037 23:42:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:20:56.037 23:42:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:56.037 23:42:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:56.037 ************************************ 00:20:56.037 START TEST raid_superblock_test_4k 00:20:56.037 ************************************ 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local 
base_bdevs_pt 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=379933 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 379933 /var/tmp/spdk-raid.sock 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # '[' -z 379933 ']' 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:20:56.037 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:56.037 23:42:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:20:56.037 [2024-07-24 23:42:40.888609] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:20:56.038 [2024-07-24 23:42:40.888647] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid379933 ] 00:20:56.038 [2024-07-24 23:42:40.952505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:56.038 [2024-07-24 23:42:41.031604] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:56.296 [2024-07-24 23:42:41.087320] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:56.296 [2024-07-24 23:42:41.087349] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@864 -- # return 0 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:56.863 23:42:41 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:20:56.863 malloc1 00:20:56.863 23:42:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:57.122 [2024-07-24 23:42:42.007141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:57.122 [2024-07-24 23:42:42.007176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.122 [2024-07-24 23:42:42.007189] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a74dd0 00:20:57.122 [2024-07-24 23:42:42.007195] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:57.122 [2024-07-24 23:42:42.008319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.122 [2024-07-24 23:42:42.008339] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:57.122 pt1 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:57.122 
23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:57.122 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:20:57.380 malloc2 00:20:57.380 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:57.380 [2024-07-24 23:42:42.335600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:57.380 [2024-07-24 23:42:42.335634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.380 [2024-07-24 23:42:42.335646] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a758d0 00:20:57.380 [2024-07-24 23:42:42.335652] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:57.381 [2024-07-24 23:42:42.336718] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.381 [2024-07-24 23:42:42.336739] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:57.381 pt2 00:20:57.381 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:57.381 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:57.381 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:20:57.681 [2024-07-24 23:42:42.500042] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:57.681 [2024-07-24 23:42:42.500930] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:57.681 [2024-07-24 23:42:42.501031] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a782d0 00:20:57.681 [2024-07-24 23:42:42.501039] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:20:57.681 [2024-07-24 23:42:42.501168] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a77730 00:20:57.681 [2024-07-24 23:42:42.501273] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a782d0 00:20:57.681 [2024-07-24 23:42:42.501278] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a782d0 00:20:57.681 [2024-07-24 23:42:42.501343] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.681 23:42:42 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.681 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.940 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.940 "name": "raid_bdev1", 00:20:57.940 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:20:57.940 "strip_size_kb": 0, 00:20:57.940 "state": "online", 00:20:57.940 "raid_level": "raid1", 00:20:57.940 "superblock": true, 00:20:57.940 "num_base_bdevs": 2, 00:20:57.940 "num_base_bdevs_discovered": 2, 00:20:57.940 "num_base_bdevs_operational": 2, 00:20:57.940 "base_bdevs_list": [ 00:20:57.940 { 00:20:57.940 "name": "pt1", 00:20:57.940 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:57.940 "is_configured": true, 00:20:57.940 "data_offset": 256, 00:20:57.940 "data_size": 7936 00:20:57.940 }, 00:20:57.940 { 00:20:57.940 "name": "pt2", 00:20:57.940 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:57.940 "is_configured": true, 00:20:57.940 "data_offset": 256, 00:20:57.940 "data_size": 7936 00:20:57.940 } 00:20:57.940 ] 00:20:57.940 }' 00:20:57.940 23:42:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.940 23:42:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:58.198 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:58.457 [2024-07-24 23:42:43.298234] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:58.457 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:58.457 "name": "raid_bdev1", 00:20:58.457 "aliases": [ 00:20:58.457 "1bf47b22-e964-4134-a8c1-7a8b98351286" 00:20:58.457 ], 00:20:58.457 "product_name": "Raid Volume", 00:20:58.457 "block_size": 4096, 00:20:58.457 "num_blocks": 7936, 00:20:58.457 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:20:58.457 "assigned_rate_limits": { 00:20:58.457 "rw_ios_per_sec": 0, 00:20:58.457 "rw_mbytes_per_sec": 0, 00:20:58.457 "r_mbytes_per_sec": 0, 00:20:58.457 "w_mbytes_per_sec": 0 00:20:58.457 }, 00:20:58.457 "claimed": false, 00:20:58.457 "zoned": false, 00:20:58.457 "supported_io_types": { 00:20:58.457 "read": true, 00:20:58.457 "write": true, 00:20:58.457 "unmap": false, 00:20:58.457 "flush": false, 00:20:58.457 "reset": true, 00:20:58.457 "nvme_admin": false, 00:20:58.457 "nvme_io": false, 00:20:58.457 "nvme_io_md": false, 00:20:58.457 "write_zeroes": true, 00:20:58.457 "zcopy": false, 00:20:58.457 "get_zone_info": false, 00:20:58.457 "zone_management": false, 00:20:58.457 "zone_append": false, 
00:20:58.457 "compare": false, 00:20:58.457 "compare_and_write": false, 00:20:58.457 "abort": false, 00:20:58.457 "seek_hole": false, 00:20:58.457 "seek_data": false, 00:20:58.457 "copy": false, 00:20:58.457 "nvme_iov_md": false 00:20:58.457 }, 00:20:58.457 "memory_domains": [ 00:20:58.457 { 00:20:58.457 "dma_device_id": "system", 00:20:58.457 "dma_device_type": 1 00:20:58.457 }, 00:20:58.457 { 00:20:58.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.457 "dma_device_type": 2 00:20:58.457 }, 00:20:58.457 { 00:20:58.457 "dma_device_id": "system", 00:20:58.457 "dma_device_type": 1 00:20:58.457 }, 00:20:58.457 { 00:20:58.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.457 "dma_device_type": 2 00:20:58.457 } 00:20:58.457 ], 00:20:58.457 "driver_specific": { 00:20:58.457 "raid": { 00:20:58.457 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:20:58.457 "strip_size_kb": 0, 00:20:58.457 "state": "online", 00:20:58.457 "raid_level": "raid1", 00:20:58.457 "superblock": true, 00:20:58.457 "num_base_bdevs": 2, 00:20:58.457 "num_base_bdevs_discovered": 2, 00:20:58.457 "num_base_bdevs_operational": 2, 00:20:58.457 "base_bdevs_list": [ 00:20:58.457 { 00:20:58.457 "name": "pt1", 00:20:58.457 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:58.457 "is_configured": true, 00:20:58.457 "data_offset": 256, 00:20:58.457 "data_size": 7936 00:20:58.457 }, 00:20:58.457 { 00:20:58.457 "name": "pt2", 00:20:58.457 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:58.457 "is_configured": true, 00:20:58.457 "data_offset": 256, 00:20:58.457 "data_size": 7936 00:20:58.457 } 00:20:58.457 ] 00:20:58.457 } 00:20:58.457 } 00:20:58.457 }' 00:20:58.457 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:58.457 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:58.457 pt2' 00:20:58.457 23:42:43 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:58.457 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:58.457 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:58.716 "name": "pt1", 00:20:58.716 "aliases": [ 00:20:58.716 "00000000-0000-0000-0000-000000000001" 00:20:58.716 ], 00:20:58.716 "product_name": "passthru", 00:20:58.716 "block_size": 4096, 00:20:58.716 "num_blocks": 8192, 00:20:58.716 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:58.716 "assigned_rate_limits": { 00:20:58.716 "rw_ios_per_sec": 0, 00:20:58.716 "rw_mbytes_per_sec": 0, 00:20:58.716 "r_mbytes_per_sec": 0, 00:20:58.716 "w_mbytes_per_sec": 0 00:20:58.716 }, 00:20:58.716 "claimed": true, 00:20:58.716 "claim_type": "exclusive_write", 00:20:58.716 "zoned": false, 00:20:58.716 "supported_io_types": { 00:20:58.716 "read": true, 00:20:58.716 "write": true, 00:20:58.716 "unmap": true, 00:20:58.716 "flush": true, 00:20:58.716 "reset": true, 00:20:58.716 "nvme_admin": false, 00:20:58.716 "nvme_io": false, 00:20:58.716 "nvme_io_md": false, 00:20:58.716 "write_zeroes": true, 00:20:58.716 "zcopy": true, 00:20:58.716 "get_zone_info": false, 00:20:58.716 "zone_management": false, 00:20:58.716 "zone_append": false, 00:20:58.716 "compare": false, 00:20:58.716 "compare_and_write": false, 00:20:58.716 "abort": true, 00:20:58.716 "seek_hole": false, 00:20:58.716 "seek_data": false, 00:20:58.716 "copy": true, 00:20:58.716 "nvme_iov_md": false 00:20:58.716 }, 00:20:58.716 "memory_domains": [ 00:20:58.716 { 00:20:58.716 "dma_device_id": "system", 00:20:58.716 "dma_device_type": 1 00:20:58.716 }, 00:20:58.716 { 00:20:58.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.716 
"dma_device_type": 2 00:20:58.716 } 00:20:58.716 ], 00:20:58.716 "driver_specific": { 00:20:58.716 "passthru": { 00:20:58.716 "name": "pt1", 00:20:58.716 "base_bdev_name": "malloc1" 00:20:58.716 } 00:20:58.716 } 00:20:58.716 }' 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:58.716 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:58.974 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:59.232 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:59.232 "name": "pt2", 
00:20:59.232 "aliases": [ 00:20:59.232 "00000000-0000-0000-0000-000000000002" 00:20:59.232 ], 00:20:59.232 "product_name": "passthru", 00:20:59.232 "block_size": 4096, 00:20:59.232 "num_blocks": 8192, 00:20:59.232 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:59.232 "assigned_rate_limits": { 00:20:59.232 "rw_ios_per_sec": 0, 00:20:59.232 "rw_mbytes_per_sec": 0, 00:20:59.232 "r_mbytes_per_sec": 0, 00:20:59.232 "w_mbytes_per_sec": 0 00:20:59.232 }, 00:20:59.232 "claimed": true, 00:20:59.232 "claim_type": "exclusive_write", 00:20:59.232 "zoned": false, 00:20:59.232 "supported_io_types": { 00:20:59.232 "read": true, 00:20:59.232 "write": true, 00:20:59.232 "unmap": true, 00:20:59.232 "flush": true, 00:20:59.232 "reset": true, 00:20:59.232 "nvme_admin": false, 00:20:59.232 "nvme_io": false, 00:20:59.232 "nvme_io_md": false, 00:20:59.232 "write_zeroes": true, 00:20:59.232 "zcopy": true, 00:20:59.232 "get_zone_info": false, 00:20:59.232 "zone_management": false, 00:20:59.232 "zone_append": false, 00:20:59.232 "compare": false, 00:20:59.232 "compare_and_write": false, 00:20:59.232 "abort": true, 00:20:59.232 "seek_hole": false, 00:20:59.232 "seek_data": false, 00:20:59.232 "copy": true, 00:20:59.232 "nvme_iov_md": false 00:20:59.232 }, 00:20:59.232 "memory_domains": [ 00:20:59.232 { 00:20:59.232 "dma_device_id": "system", 00:20:59.232 "dma_device_type": 1 00:20:59.232 }, 00:20:59.232 { 00:20:59.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.232 "dma_device_type": 2 00:20:59.232 } 00:20:59.232 ], 00:20:59.232 "driver_specific": { 00:20:59.232 "passthru": { 00:20:59.232 "name": "pt2", 00:20:59.232 "base_bdev_name": "malloc2" 00:20:59.232 } 00:20:59.232 } 00:20:59.232 }' 00:20:59.232 23:42:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.232 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:59.491 [2024-07-24 23:42:44.465221] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1bf47b22-e964-4134-a8c1-7a8b98351286 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 1bf47b22-e964-4134-a8c1-7a8b98351286 ']' 00:20:59.491 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:59.750 [2024-07-24 23:42:44.641522] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:59.750 [2024-07-24 23:42:44.641541] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:59.750 [2024-07-24 23:42:44.641578] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:59.750 [2024-07-24 23:42:44.641614] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:59.750 [2024-07-24 23:42:44.641619] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a782d0 name raid_bdev1, state offline 00:20:59.750 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.750 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:00.008 23:42:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:00.267 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:00.267 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:00.526 23:42:45 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # local es=0 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:00.526 [2024-07-24 23:42:45.467663] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:00.526 [2024-07-24 23:42:45.468677] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:00.526 [2024-07-24 23:42:45.468717] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:00.526 [2024-07-24 23:42:45.468744] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:00.526 [2024-07-24 23:42:45.468754] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:00.526 [2024-07-24 23:42:45.468759] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a78010 name raid_bdev1, state configuring 00:21:00.526 request: 00:21:00.526 { 00:21:00.526 "name": "raid_bdev1", 00:21:00.526 "raid_level": "raid1", 00:21:00.526 "base_bdevs": [ 00:21:00.526 "malloc1", 00:21:00.526 "malloc2" 00:21:00.526 ], 00:21:00.526 "superblock": false, 00:21:00.526 "method": "bdev_raid_create", 00:21:00.526 "req_id": 1 00:21:00.526 } 00:21:00.526 Got JSON-RPC error response 00:21:00.526 response: 00:21:00.526 { 00:21:00.526 "code": -17, 00:21:00.526 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:00.526 } 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # es=1 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.526 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:00.785 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:00.785 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:00.785 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:01.044 [2024-07-24 23:42:45.796499] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:01.044 [2024-07-24 23:42:45.796532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.044 [2024-07-24 23:42:45.796543] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a778d0 00:21:01.044 [2024-07-24 23:42:45.796549] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.044 [2024-07-24 23:42:45.797722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.044 [2024-07-24 23:42:45.797746] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:01.044 [2024-07-24 23:42:45.797792] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:01.044 [2024-07-24 23:42:45.797810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:01.044 pt1 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.045 "name": "raid_bdev1", 00:21:01.045 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:01.045 "strip_size_kb": 0, 00:21:01.045 "state": "configuring", 00:21:01.045 "raid_level": "raid1", 00:21:01.045 "superblock": true, 00:21:01.045 "num_base_bdevs": 2, 00:21:01.045 "num_base_bdevs_discovered": 1, 00:21:01.045 "num_base_bdevs_operational": 2, 00:21:01.045 "base_bdevs_list": [ 00:21:01.045 { 00:21:01.045 "name": "pt1", 00:21:01.045 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:01.045 "is_configured": true, 00:21:01.045 "data_offset": 256, 00:21:01.045 "data_size": 7936 00:21:01.045 }, 00:21:01.045 { 00:21:01.045 "name": null, 00:21:01.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:01.045 
"is_configured": false, 00:21:01.045 "data_offset": 256, 00:21:01.045 "data_size": 7936 00:21:01.045 } 00:21:01.045 ] 00:21:01.045 }' 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.045 23:42:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:01.613 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:01.613 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:01.613 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:01.613 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:01.613 [2024-07-24 23:42:46.610602] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:01.613 [2024-07-24 23:42:46.610641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.613 [2024-07-24 23:42:46.610652] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b38510 00:21:01.613 [2024-07-24 23:42:46.610657] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.613 [2024-07-24 23:42:46.610915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.613 [2024-07-24 23:42:46.610927] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:01.613 [2024-07-24 23:42:46.610970] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:01.613 [2024-07-24 23:42:46.610983] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:01.613 [2024-07-24 23:42:46.611057] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a77400 
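verify_raid_bdev_state (bdev_raid.sh@116-128) fetches `bdev_raid_get_bdevs all` and narrows the reply to one entry with a jq select. A minimal stand-alone reproduction of that selection, using a hypothetical one-entry reply in place of the live RPC output:

```shell
# Hypothetical stand-in for the `bdev_raid_get_bdevs all` reply once the
# raid bdev has gone online with both base bdevs discovered.
all_bdevs='[{"name":"raid_bdev1","state":"online","raid_level":"raid1",
  "num_base_bdevs_discovered":2,"num_base_bdevs_operational":2}]'

# Same selection as bdev_raid.sh@126: pick the entry for raid_bdev1.
raid_bdev_info=$(echo "$all_bdevs" | jq -r '.[] | select(.name == "raid_bdev1")')

# The helper then asserts on individual fields, e.g. the state:
echo "$raid_bdev_info" | jq -r .state
# prints: online
```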
00:21:01.613 [2024-07-24 23:42:46.611066] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:01.613 [2024-07-24 23:42:46.611183] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6ef50 00:21:01.613 [2024-07-24 23:42:46.611272] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a77400 00:21:01.613 [2024-07-24 23:42:46.611277] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a77400 00:21:01.613 [2024-07-24 23:42:46.611343] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:01.873 pt2 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 
00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.873 "name": "raid_bdev1", 00:21:01.873 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:01.873 "strip_size_kb": 0, 00:21:01.873 "state": "online", 00:21:01.873 "raid_level": "raid1", 00:21:01.873 "superblock": true, 00:21:01.873 "num_base_bdevs": 2, 00:21:01.873 "num_base_bdevs_discovered": 2, 00:21:01.873 "num_base_bdevs_operational": 2, 00:21:01.873 "base_bdevs_list": [ 00:21:01.873 { 00:21:01.873 "name": "pt1", 00:21:01.873 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:01.873 "is_configured": true, 00:21:01.873 "data_offset": 256, 00:21:01.873 "data_size": 7936 00:21:01.873 }, 00:21:01.873 { 00:21:01.873 "name": "pt2", 00:21:01.873 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:01.873 "is_configured": true, 00:21:01.873 "data_offset": 256, 00:21:01.873 "data_size": 7936 00:21:01.873 } 00:21:01.873 ] 00:21:01.873 }' 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.873 23:42:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:02.440 23:42:47 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:02.440 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:02.440 [2024-07-24 23:42:47.428862] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:02.698 "name": "raid_bdev1", 00:21:02.698 "aliases": [ 00:21:02.698 "1bf47b22-e964-4134-a8c1-7a8b98351286" 00:21:02.698 ], 00:21:02.698 "product_name": "Raid Volume", 00:21:02.698 "block_size": 4096, 00:21:02.698 "num_blocks": 7936, 00:21:02.698 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:02.698 "assigned_rate_limits": { 00:21:02.698 "rw_ios_per_sec": 0, 00:21:02.698 "rw_mbytes_per_sec": 0, 00:21:02.698 "r_mbytes_per_sec": 0, 00:21:02.698 "w_mbytes_per_sec": 0 00:21:02.698 }, 00:21:02.698 "claimed": false, 00:21:02.698 "zoned": false, 00:21:02.698 "supported_io_types": { 00:21:02.698 "read": true, 00:21:02.698 "write": true, 00:21:02.698 "unmap": false, 00:21:02.698 "flush": false, 00:21:02.698 "reset": true, 00:21:02.698 "nvme_admin": false, 00:21:02.698 "nvme_io": false, 00:21:02.698 "nvme_io_md": false, 00:21:02.698 "write_zeroes": true, 00:21:02.698 "zcopy": false, 00:21:02.698 "get_zone_info": false, 00:21:02.698 "zone_management": false, 00:21:02.698 "zone_append": false, 00:21:02.698 "compare": false, 00:21:02.698 "compare_and_write": false, 00:21:02.698 "abort": false, 00:21:02.698 "seek_hole": false, 00:21:02.698 "seek_data": false, 00:21:02.698 "copy": false, 00:21:02.698 "nvme_iov_md": false 00:21:02.698 }, 00:21:02.698 "memory_domains": [ 
00:21:02.698 { 00:21:02.698 "dma_device_id": "system", 00:21:02.698 "dma_device_type": 1 00:21:02.698 }, 00:21:02.698 { 00:21:02.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.698 "dma_device_type": 2 00:21:02.698 }, 00:21:02.698 { 00:21:02.698 "dma_device_id": "system", 00:21:02.698 "dma_device_type": 1 00:21:02.698 }, 00:21:02.698 { 00:21:02.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.698 "dma_device_type": 2 00:21:02.698 } 00:21:02.698 ], 00:21:02.698 "driver_specific": { 00:21:02.698 "raid": { 00:21:02.698 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:02.698 "strip_size_kb": 0, 00:21:02.698 "state": "online", 00:21:02.698 "raid_level": "raid1", 00:21:02.698 "superblock": true, 00:21:02.698 "num_base_bdevs": 2, 00:21:02.698 "num_base_bdevs_discovered": 2, 00:21:02.698 "num_base_bdevs_operational": 2, 00:21:02.698 "base_bdevs_list": [ 00:21:02.698 { 00:21:02.698 "name": "pt1", 00:21:02.698 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:02.698 "is_configured": true, 00:21:02.698 "data_offset": 256, 00:21:02.698 "data_size": 7936 00:21:02.698 }, 00:21:02.698 { 00:21:02.698 "name": "pt2", 00:21:02.698 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:02.698 "is_configured": true, 00:21:02.698 "data_offset": 256, 00:21:02.698 "data_size": 7936 00:21:02.698 } 00:21:02.698 ] 00:21:02.698 } 00:21:02.698 } 00:21:02.698 }' 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:02.698 pt2' 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:02.698 
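`bdev_get_bdevs -b <name>` returns a one-element JSON array, which bdev_raid.sh@204 unwraps with `jq '.[]'` before the later per-field probes. A stand-alone sketch with a hypothetical reply:

```shell
# Hypothetical one-element reply, shaped like `rpc.py bdev_get_bdevs -b pt1`.
bdevs='[{"name":"pt1","product_name":"passthru","block_size":4096,"num_blocks":8192}]'

# bdev_raid.sh@204 unwraps the array so later jq probes see a bare object.
base_bdev_info=$(echo "$bdevs" | jq '.[]')
echo "$base_bdev_info" | jq .block_size
# prints: 4096
```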
23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.698 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.698 "name": "pt1", 00:21:02.698 "aliases": [ 00:21:02.698 "00000000-0000-0000-0000-000000000001" 00:21:02.698 ], 00:21:02.698 "product_name": "passthru", 00:21:02.698 "block_size": 4096, 00:21:02.698 "num_blocks": 8192, 00:21:02.698 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:02.698 "assigned_rate_limits": { 00:21:02.698 "rw_ios_per_sec": 0, 00:21:02.698 "rw_mbytes_per_sec": 0, 00:21:02.698 "r_mbytes_per_sec": 0, 00:21:02.698 "w_mbytes_per_sec": 0 00:21:02.698 }, 00:21:02.698 "claimed": true, 00:21:02.698 "claim_type": "exclusive_write", 00:21:02.698 "zoned": false, 00:21:02.698 "supported_io_types": { 00:21:02.698 "read": true, 00:21:02.698 "write": true, 00:21:02.698 "unmap": true, 00:21:02.698 "flush": true, 00:21:02.698 "reset": true, 00:21:02.698 "nvme_admin": false, 00:21:02.698 "nvme_io": false, 00:21:02.699 "nvme_io_md": false, 00:21:02.699 "write_zeroes": true, 00:21:02.699 "zcopy": true, 00:21:02.699 "get_zone_info": false, 00:21:02.699 "zone_management": false, 00:21:02.699 "zone_append": false, 00:21:02.699 "compare": false, 00:21:02.699 "compare_and_write": false, 00:21:02.699 "abort": true, 00:21:02.699 "seek_hole": false, 00:21:02.699 "seek_data": false, 00:21:02.699 "copy": true, 00:21:02.699 "nvme_iov_md": false 00:21:02.699 }, 00:21:02.699 "memory_domains": [ 00:21:02.699 { 00:21:02.699 "dma_device_id": "system", 00:21:02.699 "dma_device_type": 1 00:21:02.699 }, 00:21:02.699 { 00:21:02.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.699 "dma_device_type": 2 00:21:02.699 } 00:21:02.699 ], 00:21:02.699 "driver_specific": { 00:21:02.699 "passthru": { 00:21:02.699 "name": "pt1", 00:21:02.699 "base_bdev_name": "malloc1" 00:21:02.699 } 00:21:02.699 } 00:21:02.699 }' 00:21:02.699 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.958 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.216 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.216 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:03.216 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:03.216 23:42:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.216 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.216 "name": "pt2", 00:21:03.216 "aliases": [ 00:21:03.216 "00000000-0000-0000-0000-000000000002" 00:21:03.216 ], 00:21:03.216 "product_name": "passthru", 00:21:03.216 "block_size": 4096, 00:21:03.216 "num_blocks": 8192, 00:21:03.216 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:03.216 "assigned_rate_limits": { 00:21:03.216 
"rw_ios_per_sec": 0, 00:21:03.216 "rw_mbytes_per_sec": 0, 00:21:03.216 "r_mbytes_per_sec": 0, 00:21:03.216 "w_mbytes_per_sec": 0 00:21:03.216 }, 00:21:03.216 "claimed": true, 00:21:03.216 "claim_type": "exclusive_write", 00:21:03.216 "zoned": false, 00:21:03.216 "supported_io_types": { 00:21:03.216 "read": true, 00:21:03.216 "write": true, 00:21:03.216 "unmap": true, 00:21:03.216 "flush": true, 00:21:03.216 "reset": true, 00:21:03.216 "nvme_admin": false, 00:21:03.216 "nvme_io": false, 00:21:03.216 "nvme_io_md": false, 00:21:03.216 "write_zeroes": true, 00:21:03.216 "zcopy": true, 00:21:03.216 "get_zone_info": false, 00:21:03.216 "zone_management": false, 00:21:03.216 "zone_append": false, 00:21:03.216 "compare": false, 00:21:03.216 "compare_and_write": false, 00:21:03.216 "abort": true, 00:21:03.216 "seek_hole": false, 00:21:03.216 "seek_data": false, 00:21:03.216 "copy": true, 00:21:03.216 "nvme_iov_md": false 00:21:03.216 }, 00:21:03.216 "memory_domains": [ 00:21:03.216 { 00:21:03.216 "dma_device_id": "system", 00:21:03.216 "dma_device_type": 1 00:21:03.216 }, 00:21:03.216 { 00:21:03.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.216 "dma_device_type": 2 00:21:03.216 } 00:21:03.216 ], 00:21:03.216 "driver_specific": { 00:21:03.216 "passthru": { 00:21:03.216 "name": "pt2", 00:21:03.216 "base_bdev_name": "malloc2" 00:21:03.216 } 00:21:03.216 } 00:21:03.216 }' 00:21:03.216 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.216 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
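The `[[ null == null ]]` comparisons above work because jq prints the literal token `null` for a key that is absent (or set to JSON null), so bash ends up string-comparing "null". A minimal sketch, with a hypothetical cut-down bdev object:

```shell
# pt2 reports no metadata: the JSON carries no md_size key at all, and jq
# emits the token `null` for a missing key, which bash then string-compares.
md_size=$(echo '{"name":"pt2","block_size":4096}' | jq .md_size)
[[ $md_size == null ]] && echo "no metadata size reported"
# prints: no metadata size reported
```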
00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:03.475 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:03.733 [2024-07-24 23:42:48.595883] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:03.733 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 1bf47b22-e964-4134-a8c1-7a8b98351286 '!=' 1bf47b22-e964-4134-a8c1-7a8b98351286 ']' 00:21:03.733 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:03.733 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:03.733 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:21:03.733 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:03.990 [2024-07-24 23:42:48.764183] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:03.990 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:03.990 23:42:48 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:03.990 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.990 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.990 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.991 "name": "raid_bdev1", 00:21:03.991 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:03.991 "strip_size_kb": 0, 00:21:03.991 "state": "online", 00:21:03.991 "raid_level": "raid1", 00:21:03.991 "superblock": true, 00:21:03.991 "num_base_bdevs": 2, 00:21:03.991 "num_base_bdevs_discovered": 1, 00:21:03.991 "num_base_bdevs_operational": 1, 00:21:03.991 "base_bdevs_list": [ 00:21:03.991 { 00:21:03.991 "name": null, 00:21:03.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.991 "is_configured": false, 00:21:03.991 "data_offset": 256, 00:21:03.991 "data_size": 7936 
00:21:03.991 }, 00:21:03.991 { 00:21:03.991 "name": "pt2", 00:21:03.991 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:03.991 "is_configured": true, 00:21:03.991 "data_offset": 256, 00:21:03.991 "data_size": 7936 00:21:03.991 } 00:21:03.991 ] 00:21:03.991 }' 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.991 23:42:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:04.557 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:04.815 [2024-07-24 23:42:49.582279] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:04.815 [2024-07-24 23:42:49.582299] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:04.815 [2024-07-24 23:42:49.582339] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.815 [2024-07-24 23:42:49.582370] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.815 [2024-07-24 23:42:49.582376] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a77400 name raid_bdev1, state offline 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:04.815 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:21:05.074 23:42:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:05.333 [2024-07-24 23:42:50.099732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:05.333 [2024-07-24 23:42:50.099770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:05.333 [2024-07-24 23:42:50.099780] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b38890 00:21:05.333 [2024-07-24 23:42:50.099786] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:05.333 [2024-07-24 23:42:50.100953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:05.333 [2024-07-24 23:42:50.100973] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:05.333 [2024-07-24 23:42:50.101018] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:05.333 [2024-07-24 23:42:50.101037] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:05.333 
[2024-07-24 23:42:50.101096] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a6eb50 00:21:05.333 [2024-07-24 23:42:50.101102] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:05.333 [2024-07-24 23:42:50.101217] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6ef30 00:21:05.333 [2024-07-24 23:42:50.101301] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a6eb50 00:21:05.333 [2024-07-24 23:42:50.101306] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a6eb50 00:21:05.334 [2024-07-24 23:42:50.101373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:05.334 pt2 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.334 "name": "raid_bdev1", 00:21:05.334 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:05.334 "strip_size_kb": 0, 00:21:05.334 "state": "online", 00:21:05.334 "raid_level": "raid1", 00:21:05.334 "superblock": true, 00:21:05.334 "num_base_bdevs": 2, 00:21:05.334 "num_base_bdevs_discovered": 1, 00:21:05.334 "num_base_bdevs_operational": 1, 00:21:05.334 "base_bdevs_list": [ 00:21:05.334 { 00:21:05.334 "name": null, 00:21:05.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.334 "is_configured": false, 00:21:05.334 "data_offset": 256, 00:21:05.334 "data_size": 7936 00:21:05.334 }, 00:21:05.334 { 00:21:05.334 "name": "pt2", 00:21:05.334 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:05.334 "is_configured": true, 00:21:05.334 "data_offset": 256, 00:21:05.334 "data_size": 7936 00:21:05.334 } 00:21:05.334 ] 00:21:05.334 }' 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.334 23:42:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:05.900 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:05.900 [2024-07-24 23:42:50.869795] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:05.900 [2024-07-24 23:42:50.869813] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:05.900 [2024-07-24 23:42:50.869846] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:05.900 [2024-07-24 
23:42:50.869876] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:05.900 [2024-07-24 23:42:50.869881] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a6eb50 name raid_bdev1, state offline 00:21:05.900 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.900 23:42:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:06.158 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:06.158 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:06.158 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:21:06.158 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:06.416 [2024-07-24 23:42:51.214678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:06.416 [2024-07-24 23:42:51.214712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.416 [2024-07-24 23:42:51.214721] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a6b720 00:21:06.416 [2024-07-24 23:42:51.214727] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.416 [2024-07-24 23:42:51.215868] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.416 [2024-07-24 23:42:51.215895] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:06.416 [2024-07-24 23:42:51.215939] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:06.416 
[2024-07-24 23:42:51.215958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:06.416 [2024-07-24 23:42:51.216026] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:06.416 [2024-07-24 23:42:51.216033] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:06.416 [2024-07-24 23:42:51.216042] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a6cd10 name raid_bdev1, state configuring 00:21:06.416 [2024-07-24 23:42:51.216056] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:06.416 [2024-07-24 23:42:51.216096] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a6cf90 00:21:06.416 [2024-07-24 23:42:51.216102] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:06.416 [2024-07-24 23:42:51.216213] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6cad0 00:21:06.416 [2024-07-24 23:42:51.216296] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a6cf90 00:21:06.416 [2024-07-24 23:42:51.216301] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a6cf90 00:21:06.416 [2024-07-24 23:42:51.216370] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:06.416 pt1 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:21:06.416 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.417 "name": "raid_bdev1", 00:21:06.417 "uuid": "1bf47b22-e964-4134-a8c1-7a8b98351286", 00:21:06.417 "strip_size_kb": 0, 00:21:06.417 "state": "online", 00:21:06.417 "raid_level": "raid1", 00:21:06.417 "superblock": true, 00:21:06.417 "num_base_bdevs": 2, 00:21:06.417 "num_base_bdevs_discovered": 1, 00:21:06.417 "num_base_bdevs_operational": 1, 00:21:06.417 "base_bdevs_list": [ 00:21:06.417 { 00:21:06.417 "name": null, 00:21:06.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.417 "is_configured": false, 00:21:06.417 "data_offset": 256, 00:21:06.417 "data_size": 7936 00:21:06.417 }, 00:21:06.417 { 00:21:06.417 "name": "pt2", 00:21:06.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:06.417 "is_configured": true, 00:21:06.417 "data_offset": 256, 00:21:06.417 "data_size": 7936 00:21:06.417 } 00:21:06.417 ] 00:21:06.417 }' 00:21:06.417 23:42:51 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.417 23:42:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:06.983 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:06.983 23:42:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:07.241 [2024-07-24 23:42:52.173314] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 1bf47b22-e964-4134-a8c1-7a8b98351286 '!=' 1bf47b22-e964-4134-a8c1-7a8b98351286 ']' 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 379933 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # '[' -z 379933 ']' 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # kill -0 379933 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # uname 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 379933 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 379933' 00:21:07.241 killing process with pid 379933 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@969 -- # kill 379933 00:21:07.241 [2024-07-24 23:42:52.229100] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:07.241 [2024-07-24 23:42:52.229138] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:07.241 [2024-07-24 23:42:52.229171] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:07.241 [2024-07-24 23:42:52.229177] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a6cf90 name raid_bdev1, state offline 00:21:07.241 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@974 -- # wait 379933 00:21:07.500 [2024-07-24 23:42:52.244232] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:07.500 23:42:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:21:07.500 00:21:07.500 real 0m11.576s 00:21:07.500 user 0m21.305s 00:21:07.500 sys 0m1.792s 00:21:07.500 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:07.500 23:42:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:07.500 ************************************ 00:21:07.500 END TEST raid_superblock_test_4k 00:21:07.500 ************************************ 00:21:07.500 23:42:52 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:21:07.501 23:42:52 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:21:07.501 23:42:52 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 
-le 1 ']' 00:21:07.501 23:42:52 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:07.501 23:42:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:07.501 ************************************ 00:21:07.501 START TEST raid_rebuild_test_sb_4k 00:21:07.501 ************************************ 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:07.501 23:42:52 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=382284 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 382284 /var/tmp/spdk-raid.sock 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 382284 ']' 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:07.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:07.501 23:42:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:07.760 [2024-07-24 23:42:52.532198] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:21:07.760 [2024-07-24 23:42:52.532235] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid382284 ] 00:21:07.760 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:07.760 Zero copy mechanism will not be used. 00:21:07.760 [2024-07-24 23:42:52.594161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:07.760 [2024-07-24 23:42:52.672350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:07.760 [2024-07-24 23:42:52.722168] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:07.760 [2024-07-24 23:42:52.722195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:08.326 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:08.326 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:21:08.326 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:08.326 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:21:08.583 BaseBdev1_malloc 00:21:08.583 23:42:53 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:08.840 [2024-07-24 23:42:53.645610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:08.840 [2024-07-24 23:42:53.645644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.840 [2024-07-24 23:42:53.645658] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11db1c0 00:21:08.840 [2024-07-24 23:42:53.645664] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.840 [2024-07-24 23:42:53.646827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.840 [2024-07-24 23:42:53.646848] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:08.840 BaseBdev1 00:21:08.840 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:08.840 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:21:08.840 BaseBdev2_malloc 00:21:08.840 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:09.098 [2024-07-24 23:42:53.969955] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:09.098 [2024-07-24 23:42:53.969987] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.098 [2024-07-24 23:42:53.970001] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11dbce0 00:21:09.098 [2024-07-24 23:42:53.970007] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.098 [2024-07-24 23:42:53.971080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.098 [2024-07-24 23:42:53.971100] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:09.098 BaseBdev2 00:21:09.098 23:42:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:21:09.358 spare_malloc 00:21:09.358 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:09.358 spare_delay 00:21:09.358 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:09.617 [2024-07-24 23:42:54.466660] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:09.617 [2024-07-24 23:42:54.466690] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.617 [2024-07-24 23:42:54.466702] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138a340 00:21:09.617 [2024-07-24 23:42:54.466708] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.617 [2024-07-24 23:42:54.467790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.617 [2024-07-24 23:42:54.467810] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:09.617 spare 00:21:09.617 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:09.876 [2024-07-24 23:42:54.631115] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:09.876 [2024-07-24 23:42:54.632019] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:09.876 [2024-07-24 23:42:54.632140] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x138b4f0 00:21:09.876 [2024-07-24 23:42:54.632149] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:09.876 [2024-07-24 23:42:54.632283] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1384910 00:21:09.876 [2024-07-24 23:42:54.632383] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138b4f0 00:21:09.876 [2024-07-24 23:42:54.632388] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x138b4f0 00:21:09.876 [2024-07-24 23:42:54.632458] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.876 "name": "raid_bdev1", 00:21:09.876 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:09.876 "strip_size_kb": 0, 00:21:09.876 "state": "online", 00:21:09.876 "raid_level": "raid1", 00:21:09.876 "superblock": true, 00:21:09.876 "num_base_bdevs": 2, 00:21:09.876 "num_base_bdevs_discovered": 2, 00:21:09.876 "num_base_bdevs_operational": 2, 00:21:09.876 "base_bdevs_list": [ 00:21:09.876 { 00:21:09.876 "name": "BaseBdev1", 00:21:09.876 "uuid": "c661c6c9-d74e-5f07-8c84-5b30980ad245", 00:21:09.876 "is_configured": true, 00:21:09.876 "data_offset": 256, 00:21:09.876 "data_size": 7936 00:21:09.876 }, 00:21:09.876 { 00:21:09.876 "name": "BaseBdev2", 00:21:09.876 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:09.876 "is_configured": true, 00:21:09.876 "data_offset": 256, 00:21:09.876 "data_size": 7936 00:21:09.876 } 00:21:09.876 ] 00:21:09.876 }' 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.876 23:42:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:10.443 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:10.443 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:10.443 [2024-07-24 23:42:55.429306] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 
-- # (( i < 1 )) 00:21:10.702 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:10.961 [2024-07-24 23:42:55.774161] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1384910 00:21:10.961 /dev/nbd0 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:10.961 1+0 records in 00:21:10.961 1+0 records out 00:21:10.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222416 s, 18.4 MB/s 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:10.961 23:42:55 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:10.961 23:42:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:21:11.531 7936+0 records in 00:21:11.531 7936+0 records out 00:21:11.531 32505856 bytes (33 MB, 31 MiB) copied, 0.510566 s, 63.7 MB/s 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:11.531 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:11.826 [2024-07-24 23:42:56.535099] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:11.826 [2024-07-24 23:42:56.695541] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.826 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.827 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.827 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.827 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.827 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.085 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.085 "name": "raid_bdev1", 00:21:12.085 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:12.085 "strip_size_kb": 0, 00:21:12.085 "state": "online", 00:21:12.085 "raid_level": "raid1", 00:21:12.085 "superblock": true, 00:21:12.085 "num_base_bdevs": 2, 00:21:12.085 "num_base_bdevs_discovered": 1, 00:21:12.085 "num_base_bdevs_operational": 1, 00:21:12.085 "base_bdevs_list": [ 00:21:12.085 { 00:21:12.085 "name": null, 00:21:12.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.085 "is_configured": false, 00:21:12.085 "data_offset": 256, 00:21:12.085 "data_size": 7936 00:21:12.085 }, 00:21:12.085 { 00:21:12.085 "name": "BaseBdev2", 00:21:12.085 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:12.085 "is_configured": true, 00:21:12.085 "data_offset": 256, 00:21:12.085 "data_size": 7936 00:21:12.085 } 00:21:12.085 ] 00:21:12.085 }' 00:21:12.085 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.085 23:42:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:12.651 23:42:57 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:12.651 [2024-07-24 23:42:57.509642] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:12.651 [2024-07-24 23:42:57.513937] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1384910 00:21:12.651 [2024-07-24 23:42:57.515321] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:12.651 23:42:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.586 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:13.844 "name": "raid_bdev1", 00:21:13.844 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:13.844 "strip_size_kb": 0, 00:21:13.844 "state": "online", 00:21:13.844 "raid_level": "raid1", 00:21:13.844 "superblock": true, 00:21:13.844 "num_base_bdevs": 2, 00:21:13.844 "num_base_bdevs_discovered": 2, 
00:21:13.844 "num_base_bdevs_operational": 2, 00:21:13.844 "process": { 00:21:13.844 "type": "rebuild", 00:21:13.844 "target": "spare", 00:21:13.844 "progress": { 00:21:13.844 "blocks": 2816, 00:21:13.844 "percent": 35 00:21:13.844 } 00:21:13.844 }, 00:21:13.844 "base_bdevs_list": [ 00:21:13.844 { 00:21:13.844 "name": "spare", 00:21:13.844 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:13.844 "is_configured": true, 00:21:13.844 "data_offset": 256, 00:21:13.844 "data_size": 7936 00:21:13.844 }, 00:21:13.844 { 00:21:13.844 "name": "BaseBdev2", 00:21:13.844 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:13.844 "is_configured": true, 00:21:13.844 "data_offset": 256, 00:21:13.844 "data_size": 7936 00:21:13.844 } 00:21:13.844 ] 00:21:13.844 }' 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:13.844 23:42:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:14.103 [2024-07-24 23:42:58.950361] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:14.103 [2024-07-24 23:42:59.025839] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:14.103 [2024-07-24 23:42:59.025872] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:14.103 [2024-07-24 23:42:59.025882] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:14.103 [2024-07-24 23:42:59.025886] 
bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.103 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.362 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.362 "name": "raid_bdev1", 00:21:14.362 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:14.362 "strip_size_kb": 0, 00:21:14.362 "state": "online", 00:21:14.362 "raid_level": "raid1", 00:21:14.362 "superblock": true, 00:21:14.362 "num_base_bdevs": 2, 00:21:14.362 "num_base_bdevs_discovered": 1, 00:21:14.362 
"num_base_bdevs_operational": 1, 00:21:14.362 "base_bdevs_list": [ 00:21:14.362 { 00:21:14.362 "name": null, 00:21:14.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.362 "is_configured": false, 00:21:14.362 "data_offset": 256, 00:21:14.362 "data_size": 7936 00:21:14.362 }, 00:21:14.362 { 00:21:14.362 "name": "BaseBdev2", 00:21:14.362 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:14.362 "is_configured": true, 00:21:14.362 "data_offset": 256, 00:21:14.362 "data_size": 7936 00:21:14.362 } 00:21:14.362 ] 00:21:14.362 }' 00:21:14.362 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.362 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:14.928 "name": "raid_bdev1", 00:21:14.928 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:14.928 "strip_size_kb": 0, 00:21:14.928 "state": "online", 00:21:14.928 "raid_level": "raid1", 00:21:14.928 "superblock": true, 00:21:14.928 
"num_base_bdevs": 2, 00:21:14.928 "num_base_bdevs_discovered": 1, 00:21:14.928 "num_base_bdevs_operational": 1, 00:21:14.928 "base_bdevs_list": [ 00:21:14.928 { 00:21:14.928 "name": null, 00:21:14.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.928 "is_configured": false, 00:21:14.928 "data_offset": 256, 00:21:14.928 "data_size": 7936 00:21:14.928 }, 00:21:14.928 { 00:21:14.928 "name": "BaseBdev2", 00:21:14.928 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:14.928 "is_configured": true, 00:21:14.928 "data_offset": 256, 00:21:14.928 "data_size": 7936 00:21:14.928 } 00:21:14.928 ] 00:21:14.928 }' 00:21:14.928 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:15.187 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:15.187 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:15.187 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:15.187 23:42:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:15.187 [2024-07-24 23:43:00.128770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:15.187 [2024-07-24 23:43:00.133024] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1384910 00:21:15.187 [2024-07-24 23:43:00.134062] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:15.187 23:43:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.561 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:16.562 "name": "raid_bdev1", 00:21:16.562 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:16.562 "strip_size_kb": 0, 00:21:16.562 "state": "online", 00:21:16.562 "raid_level": "raid1", 00:21:16.562 "superblock": true, 00:21:16.562 "num_base_bdevs": 2, 00:21:16.562 "num_base_bdevs_discovered": 2, 00:21:16.562 "num_base_bdevs_operational": 2, 00:21:16.562 "process": { 00:21:16.562 "type": "rebuild", 00:21:16.562 "target": "spare", 00:21:16.562 "progress": { 00:21:16.562 "blocks": 2816, 00:21:16.562 "percent": 35 00:21:16.562 } 00:21:16.562 }, 00:21:16.562 "base_bdevs_list": [ 00:21:16.562 { 00:21:16.562 "name": "spare", 00:21:16.562 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:16.562 "is_configured": true, 00:21:16.562 "data_offset": 256, 00:21:16.562 "data_size": 7936 00:21:16.562 }, 00:21:16.562 { 00:21:16.562 "name": "BaseBdev2", 00:21:16.562 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:16.562 "is_configured": true, 00:21:16.562 "data_offset": 256, 00:21:16.562 "data_size": 7936 00:21:16.562 } 00:21:16.562 ] 00:21:16.562 }' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:16.562 23:43:01 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:21:16.562 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=774 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.562 
23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.562 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:16.562 "name": "raid_bdev1", 00:21:16.562 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:16.562 "strip_size_kb": 0, 00:21:16.562 "state": "online", 00:21:16.562 "raid_level": "raid1", 00:21:16.562 "superblock": true, 00:21:16.562 "num_base_bdevs": 2, 00:21:16.562 "num_base_bdevs_discovered": 2, 00:21:16.562 "num_base_bdevs_operational": 2, 00:21:16.562 "process": { 00:21:16.562 "type": "rebuild", 00:21:16.562 "target": "spare", 00:21:16.562 "progress": { 00:21:16.562 "blocks": 3584, 00:21:16.562 "percent": 45 00:21:16.562 } 00:21:16.562 }, 00:21:16.562 "base_bdevs_list": [ 00:21:16.562 { 00:21:16.562 "name": "spare", 00:21:16.562 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:16.562 "is_configured": true, 00:21:16.562 "data_offset": 256, 00:21:16.562 "data_size": 7936 00:21:16.562 }, 00:21:16.562 { 00:21:16.562 "name": "BaseBdev2", 00:21:16.562 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:16.562 "is_configured": true, 00:21:16.562 "data_offset": 256, 00:21:16.562 "data_size": 7936 00:21:16.562 } 00:21:16.562 ] 00:21:16.562 }' 00:21:16.820 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:16.820 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:16.820 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:16.820 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:16.820 23:43:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:17.754 23:43:02 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.754 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:18.013 "name": "raid_bdev1", 00:21:18.013 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:18.013 "strip_size_kb": 0, 00:21:18.013 "state": "online", 00:21:18.013 "raid_level": "raid1", 00:21:18.013 "superblock": true, 00:21:18.013 "num_base_bdevs": 2, 00:21:18.013 "num_base_bdevs_discovered": 2, 00:21:18.013 "num_base_bdevs_operational": 2, 00:21:18.013 "process": { 00:21:18.013 "type": "rebuild", 00:21:18.013 "target": "spare", 00:21:18.013 "progress": { 00:21:18.013 "blocks": 6656, 00:21:18.013 "percent": 83 00:21:18.013 } 00:21:18.013 }, 00:21:18.013 "base_bdevs_list": [ 00:21:18.013 { 00:21:18.013 "name": "spare", 00:21:18.013 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:18.013 "is_configured": true, 00:21:18.013 "data_offset": 256, 00:21:18.013 "data_size": 7936 00:21:18.013 }, 00:21:18.013 { 00:21:18.013 "name": "BaseBdev2", 00:21:18.013 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:18.013 "is_configured": true, 00:21:18.013 "data_offset": 256, 00:21:18.013 
"data_size": 7936 00:21:18.013 } 00:21:18.013 ] 00:21:18.013 }' 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.013 23:43:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:18.272 [2024-07-24 23:43:03.255617] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:18.272 [2024-07-24 23:43:03.255656] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:18.272 [2024-07-24 23:43:03.255727] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.205 23:43:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:19.205 "name": "raid_bdev1", 00:21:19.205 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:19.205 "strip_size_kb": 0, 00:21:19.205 "state": "online", 00:21:19.205 "raid_level": "raid1", 00:21:19.205 "superblock": true, 00:21:19.205 "num_base_bdevs": 2, 00:21:19.205 "num_base_bdevs_discovered": 2, 00:21:19.205 "num_base_bdevs_operational": 2, 00:21:19.205 "base_bdevs_list": [ 00:21:19.205 { 00:21:19.205 "name": "spare", 00:21:19.205 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:19.205 "is_configured": true, 00:21:19.205 "data_offset": 256, 00:21:19.205 "data_size": 7936 00:21:19.205 }, 00:21:19.205 { 00:21:19.205 "name": "BaseBdev2", 00:21:19.205 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:19.205 "is_configured": true, 00:21:19.205 "data_offset": 256, 00:21:19.205 "data_size": 7936 00:21:19.205 } 00:21:19.205 ] 00:21:19.205 }' 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 
-- # local target=none 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.205 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:19.464 "name": "raid_bdev1", 00:21:19.464 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:19.464 "strip_size_kb": 0, 00:21:19.464 "state": "online", 00:21:19.464 "raid_level": "raid1", 00:21:19.464 "superblock": true, 00:21:19.464 "num_base_bdevs": 2, 00:21:19.464 "num_base_bdevs_discovered": 2, 00:21:19.464 "num_base_bdevs_operational": 2, 00:21:19.464 "base_bdevs_list": [ 00:21:19.464 { 00:21:19.464 "name": "spare", 00:21:19.464 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:19.464 "is_configured": true, 00:21:19.464 "data_offset": 256, 00:21:19.464 "data_size": 7936 00:21:19.464 }, 00:21:19.464 { 00:21:19.464 "name": "BaseBdev2", 00:21:19.464 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:19.464 "is_configured": true, 00:21:19.464 "data_offset": 256, 00:21:19.464 "data_size": 7936 00:21:19.464 } 00:21:19.464 ] 00:21:19.464 }' 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 2 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.464 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.723 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.723 "name": "raid_bdev1", 00:21:19.723 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:19.723 "strip_size_kb": 0, 00:21:19.723 "state": "online", 00:21:19.723 "raid_level": "raid1", 00:21:19.723 "superblock": true, 00:21:19.723 "num_base_bdevs": 2, 00:21:19.723 "num_base_bdevs_discovered": 2, 00:21:19.723 "num_base_bdevs_operational": 2, 00:21:19.723 "base_bdevs_list": [ 00:21:19.723 { 00:21:19.723 "name": "spare", 00:21:19.723 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:19.723 "is_configured": true, 
00:21:19.723 "data_offset": 256, 00:21:19.723 "data_size": 7936 00:21:19.723 }, 00:21:19.723 { 00:21:19.723 "name": "BaseBdev2", 00:21:19.723 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:19.723 "is_configured": true, 00:21:19.723 "data_offset": 256, 00:21:19.723 "data_size": 7936 00:21:19.723 } 00:21:19.723 ] 00:21:19.723 }' 00:21:19.723 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.723 23:43:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:20.292 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:20.292 [2024-07-24 23:43:05.200502] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:20.292 [2024-07-24 23:43:05.200522] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:20.292 [2024-07-24 23:43:05.200567] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:20.292 [2024-07-24 23:43:05.200605] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:20.292 [2024-07-24 23:43:05.200610] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138b4f0 name raid_bdev1, state offline 00:21:20.292 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.292 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # 
'[' false = true ']' 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:20.550 /dev/nbd0 00:21:20.550 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:20.809 1+0 records in 00:21:20.809 1+0 records out 00:21:20.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226953 s, 18.0 MB/s 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:20.809 /dev/nbd1 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd1 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:20.809 1+0 records in 00:21:20.809 1+0 records out 00:21:20.809 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237008 s, 17.3 MB/s 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:20.809 
23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:20.809 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:21.068 23:43:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:21.068 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:21:21.326 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:21.585 [2024-07-24 23:43:06.538270] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:21.585 [2024-07-24 23:43:06.538300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.585 [2024-07-24 23:43:06.538314] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f70c0 00:21:21.585 [2024-07-24 
23:43:06.538336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.585 [2024-07-24 23:43:06.539500] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.585 [2024-07-24 23:43:06.539519] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:21.585 [2024-07-24 23:43:06.539569] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:21.585 [2024-07-24 23:43:06.539588] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:21.585 [2024-07-24 23:43:06.539655] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:21.585 spare 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.585 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.844 [2024-07-24 23:43:06.639949] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13896e0 00:21:21.844 [2024-07-24 23:43:06.639963] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:21.844 [2024-07-24 23:43:06.640085] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138d850 00:21:21.844 [2024-07-24 23:43:06.640183] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13896e0 00:21:21.844 [2024-07-24 23:43:06.640189] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13896e0 00:21:21.844 [2024-07-24 23:43:06.640253] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:21.844 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.844 "name": "raid_bdev1", 00:21:21.844 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:21.844 "strip_size_kb": 0, 00:21:21.844 "state": "online", 00:21:21.844 "raid_level": "raid1", 00:21:21.844 "superblock": true, 00:21:21.844 "num_base_bdevs": 2, 00:21:21.844 "num_base_bdevs_discovered": 2, 00:21:21.844 "num_base_bdevs_operational": 2, 00:21:21.844 "base_bdevs_list": [ 00:21:21.844 { 00:21:21.844 "name": "spare", 00:21:21.844 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:21.844 "is_configured": true, 00:21:21.844 "data_offset": 256, 00:21:21.844 "data_size": 7936 00:21:21.844 }, 00:21:21.844 { 00:21:21.844 "name": "BaseBdev2", 00:21:21.844 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:21.844 "is_configured": true, 00:21:21.844 "data_offset": 256, 00:21:21.844 "data_size": 7936 00:21:21.844 } 00:21:21.844 ] 00:21:21.844 }' 00:21:21.844 23:43:06 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.844 23:43:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:22.411 "name": "raid_bdev1", 00:21:22.411 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:22.411 "strip_size_kb": 0, 00:21:22.411 "state": "online", 00:21:22.411 "raid_level": "raid1", 00:21:22.411 "superblock": true, 00:21:22.411 "num_base_bdevs": 2, 00:21:22.411 "num_base_bdevs_discovered": 2, 00:21:22.411 "num_base_bdevs_operational": 2, 00:21:22.411 "base_bdevs_list": [ 00:21:22.411 { 00:21:22.411 "name": "spare", 00:21:22.411 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:22.411 "is_configured": true, 00:21:22.411 "data_offset": 256, 00:21:22.411 "data_size": 7936 00:21:22.411 }, 00:21:22.411 { 00:21:22.411 "name": "BaseBdev2", 00:21:22.411 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:22.411 "is_configured": true, 00:21:22.411 "data_offset": 256, 00:21:22.411 "data_size": 
7936 00:21:22.411 } 00:21:22.411 ] 00:21:22.411 }' 00:21:22.411 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:21:22.669 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:22.928 [2024-07-24 23:43:07.785578] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:22.928 23:43:07 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.928 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.187 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.187 "name": "raid_bdev1", 00:21:23.187 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:23.187 "strip_size_kb": 0, 00:21:23.187 "state": "online", 00:21:23.187 "raid_level": "raid1", 00:21:23.187 "superblock": true, 00:21:23.187 "num_base_bdevs": 2, 00:21:23.187 "num_base_bdevs_discovered": 1, 00:21:23.187 "num_base_bdevs_operational": 1, 00:21:23.187 "base_bdevs_list": [ 00:21:23.187 { 00:21:23.187 "name": null, 00:21:23.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.187 "is_configured": false, 00:21:23.187 "data_offset": 256, 00:21:23.187 "data_size": 7936 00:21:23.187 }, 00:21:23.187 { 00:21:23.187 "name": "BaseBdev2", 00:21:23.187 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:23.187 "is_configured": true, 00:21:23.187 "data_offset": 256, 00:21:23.187 "data_size": 7936 00:21:23.187 } 00:21:23.187 ] 00:21:23.187 }' 00:21:23.187 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.187 23:43:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:23.445 23:43:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:23.704 [2024-07-24 23:43:08.591678] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:23.704 [2024-07-24 23:43:08.591786] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:23.704 [2024-07-24 23:43:08.591796] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:23.704 [2024-07-24 23:43:08.591815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:23.704 [2024-07-24 23:43:08.596054] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13899c0 00:21:23.704 [2024-07-24 23:43:08.597595] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:23.704 23:43:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.639 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.897 23:43:09 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:24.897 "name": "raid_bdev1", 00:21:24.897 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:24.897 "strip_size_kb": 0, 00:21:24.897 "state": "online", 00:21:24.897 "raid_level": "raid1", 00:21:24.897 "superblock": true, 00:21:24.897 "num_base_bdevs": 2, 00:21:24.897 "num_base_bdevs_discovered": 2, 00:21:24.897 "num_base_bdevs_operational": 2, 00:21:24.897 "process": { 00:21:24.897 "type": "rebuild", 00:21:24.897 "target": "spare", 00:21:24.897 "progress": { 00:21:24.897 "blocks": 2816, 00:21:24.897 "percent": 35 00:21:24.897 } 00:21:24.897 }, 00:21:24.897 "base_bdevs_list": [ 00:21:24.897 { 00:21:24.897 "name": "spare", 00:21:24.897 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:24.897 "is_configured": true, 00:21:24.897 "data_offset": 256, 00:21:24.897 "data_size": 7936 00:21:24.897 }, 00:21:24.897 { 00:21:24.897 "name": "BaseBdev2", 00:21:24.897 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:24.897 "is_configured": true, 00:21:24.897 "data_offset": 256, 00:21:24.897 "data_size": 7936 00:21:24.897 } 00:21:24.897 ] 00:21:24.897 }' 00:21:24.897 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:24.897 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:24.897 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:24.897 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:24.897 23:43:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:25.156 [2024-07-24 23:43:10.040301] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:25.156 [2024-07-24 23:43:10.108155] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:25.156 [2024-07-24 23:43:10.108187] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.156 [2024-07-24 23:43:10.108197] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:25.156 [2024-07-24 23:43:10.108201] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.156 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.414 23:43:10 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.414 "name": "raid_bdev1", 00:21:25.414 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:25.414 "strip_size_kb": 0, 00:21:25.414 "state": "online", 00:21:25.414 "raid_level": "raid1", 00:21:25.414 "superblock": true, 00:21:25.414 "num_base_bdevs": 2, 00:21:25.414 "num_base_bdevs_discovered": 1, 00:21:25.414 "num_base_bdevs_operational": 1, 00:21:25.414 "base_bdevs_list": [ 00:21:25.414 { 00:21:25.414 "name": null, 00:21:25.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.414 "is_configured": false, 00:21:25.414 "data_offset": 256, 00:21:25.414 "data_size": 7936 00:21:25.414 }, 00:21:25.414 { 00:21:25.414 "name": "BaseBdev2", 00:21:25.414 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:25.414 "is_configured": true, 00:21:25.414 "data_offset": 256, 00:21:25.414 "data_size": 7936 00:21:25.414 } 00:21:25.414 ] 00:21:25.414 }' 00:21:25.414 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.414 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:25.981 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:25.981 [2024-07-24 23:43:10.926424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:25.981 [2024-07-24 23:43:10.926460] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.981 [2024-07-24 23:43:10.926491] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11d1d30 00:21:25.981 [2024-07-24 23:43:10.926498] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.981 [2024-07-24 23:43:10.926766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.981 [2024-07-24 
23:43:10.926777] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:25.981 [2024-07-24 23:43:10.926832] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:25.981 [2024-07-24 23:43:10.926840] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:21:25.981 [2024-07-24 23:43:10.926845] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:25.981 [2024-07-24 23:43:10.926856] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:25.981 [2024-07-24 23:43:10.930999] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x138c5b0 00:21:25.981 spare 00:21:25.981 [2024-07-24 23:43:10.932048] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:25.981 23:43:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.357 23:43:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.358 23:43:12 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:27.358 "name": "raid_bdev1", 00:21:27.358 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:27.358 "strip_size_kb": 0, 00:21:27.358 "state": "online", 00:21:27.358 "raid_level": "raid1", 00:21:27.358 "superblock": true, 00:21:27.358 "num_base_bdevs": 2, 00:21:27.358 "num_base_bdevs_discovered": 2, 00:21:27.358 "num_base_bdevs_operational": 2, 00:21:27.358 "process": { 00:21:27.358 "type": "rebuild", 00:21:27.358 "target": "spare", 00:21:27.358 "progress": { 00:21:27.358 "blocks": 2816, 00:21:27.358 "percent": 35 00:21:27.358 } 00:21:27.358 }, 00:21:27.358 "base_bdevs_list": [ 00:21:27.358 { 00:21:27.358 "name": "spare", 00:21:27.358 "uuid": "d15224cc-1c6f-553d-a81a-b873b869a136", 00:21:27.358 "is_configured": true, 00:21:27.358 "data_offset": 256, 00:21:27.358 "data_size": 7936 00:21:27.358 }, 00:21:27.358 { 00:21:27.358 "name": "BaseBdev2", 00:21:27.358 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:27.358 "is_configured": true, 00:21:27.358 "data_offset": 256, 00:21:27.358 "data_size": 7936 00:21:27.358 } 00:21:27.358 ] 00:21:27.358 }' 00:21:27.358 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:27.358 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:27.358 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:27.358 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:27.358 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:27.358 [2024-07-24 23:43:12.342580] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:27.617 [2024-07-24 23:43:12.442581] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:27.617 [2024-07-24 23:43:12.442610] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:27.617 [2024-07-24 23:43:12.442619] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:27.617 [2024-07-24 23:43:12.442623] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.617 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.876 23:43:12 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.876 "name": "raid_bdev1", 00:21:27.876 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:27.876 "strip_size_kb": 0, 00:21:27.876 "state": "online", 00:21:27.876 "raid_level": "raid1", 00:21:27.876 "superblock": true, 00:21:27.876 "num_base_bdevs": 2, 00:21:27.876 "num_base_bdevs_discovered": 1, 00:21:27.876 "num_base_bdevs_operational": 1, 00:21:27.876 "base_bdevs_list": [ 00:21:27.876 { 00:21:27.876 "name": null, 00:21:27.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.876 "is_configured": false, 00:21:27.876 "data_offset": 256, 00:21:27.876 "data_size": 7936 00:21:27.876 }, 00:21:27.876 { 00:21:27.876 "name": "BaseBdev2", 00:21:27.876 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:27.876 "is_configured": true, 00:21:27.876 "data_offset": 256, 00:21:27.876 "data_size": 7936 00:21:27.876 } 00:21:27.876 ] 00:21:27.876 }' 00:21:27.876 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.876 23:43:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.134 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:28.392 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:28.392 "name": "raid_bdev1", 00:21:28.392 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:28.392 "strip_size_kb": 0, 00:21:28.392 "state": "online", 00:21:28.392 "raid_level": "raid1", 00:21:28.392 "superblock": true, 00:21:28.392 "num_base_bdevs": 2, 00:21:28.392 "num_base_bdevs_discovered": 1, 00:21:28.392 "num_base_bdevs_operational": 1, 00:21:28.392 "base_bdevs_list": [ 00:21:28.392 { 00:21:28.392 "name": null, 00:21:28.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.392 "is_configured": false, 00:21:28.392 "data_offset": 256, 00:21:28.392 "data_size": 7936 00:21:28.392 }, 00:21:28.392 { 00:21:28.392 "name": "BaseBdev2", 00:21:28.392 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:28.392 "is_configured": true, 00:21:28.392 "data_offset": 256, 00:21:28.392 "data_size": 7936 00:21:28.393 } 00:21:28.393 ] 00:21:28.393 }' 00:21:28.393 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:28.393 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:28.393 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:28.393 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:28.393 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:28.651 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:28.909 [2024-07-24 23:43:13.669771] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:21:28.909 [2024-07-24 23:43:13.669804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.909 [2024-07-24 23:43:13.669818] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1389d40 00:21:28.909 [2024-07-24 23:43:13.669840] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.909 [2024-07-24 23:43:13.670095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.909 [2024-07-24 23:43:13.670105] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:28.909 [2024-07-24 23:43:13.670150] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:28.909 [2024-07-24 23:43:13.670157] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:28.909 [2024-07-24 23:43:13.670162] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:28.909 BaseBdev1 00:21:28.909 23:43:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.955 "name": "raid_bdev1", 00:21:29.955 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:29.955 "strip_size_kb": 0, 00:21:29.955 "state": "online", 00:21:29.955 "raid_level": "raid1", 00:21:29.955 "superblock": true, 00:21:29.955 "num_base_bdevs": 2, 00:21:29.955 "num_base_bdevs_discovered": 1, 00:21:29.955 "num_base_bdevs_operational": 1, 00:21:29.955 "base_bdevs_list": [ 00:21:29.955 { 00:21:29.955 "name": null, 00:21:29.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.955 "is_configured": false, 00:21:29.955 "data_offset": 256, 00:21:29.955 "data_size": 7936 00:21:29.955 }, 00:21:29.955 { 00:21:29.955 "name": "BaseBdev2", 00:21:29.955 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:29.955 "is_configured": true, 00:21:29.955 "data_offset": 256, 00:21:29.955 "data_size": 7936 00:21:29.955 } 00:21:29.955 ] 00:21:29.955 }' 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.955 23:43:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none 
none 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.523 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:30.781 "name": "raid_bdev1", 00:21:30.781 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:30.781 "strip_size_kb": 0, 00:21:30.781 "state": "online", 00:21:30.781 "raid_level": "raid1", 00:21:30.781 "superblock": true, 00:21:30.781 "num_base_bdevs": 2, 00:21:30.781 "num_base_bdevs_discovered": 1, 00:21:30.781 "num_base_bdevs_operational": 1, 00:21:30.781 "base_bdevs_list": [ 00:21:30.781 { 00:21:30.781 "name": null, 00:21:30.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.781 "is_configured": false, 00:21:30.781 "data_offset": 256, 00:21:30.781 "data_size": 7936 00:21:30.781 }, 00:21:30.781 { 00:21:30.781 "name": "BaseBdev2", 00:21:30.781 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:30.781 "is_configured": true, 00:21:30.781 "data_offset": 256, 00:21:30.781 "data_size": 7936 00:21:30.781 } 00:21:30.781 ] 00:21:30.781 }' 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:30.781 23:43:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # local es=0 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.781 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:30.782 23:43:15 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:30.782 [2024-07-24 23:43:15.759181] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:30.782 [2024-07-24 23:43:15.759275] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:21:30.782 [2024-07-24 23:43:15.759284] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:30.782 request: 00:21:30.782 { 00:21:30.782 "base_bdev": "BaseBdev1", 00:21:30.782 "raid_bdev": "raid_bdev1", 00:21:30.782 "method": "bdev_raid_add_base_bdev", 00:21:30.782 "req_id": 1 00:21:30.782 } 00:21:30.782 Got JSON-RPC error response 00:21:30.782 response: 00:21:30.782 { 00:21:30.782 "code": -22, 00:21:30.782 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:30.782 } 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # es=1 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:30.782 23:43:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.159 23:43:16 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.159 "name": "raid_bdev1", 00:21:32.159 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:32.159 "strip_size_kb": 0, 00:21:32.159 "state": "online", 00:21:32.159 "raid_level": "raid1", 00:21:32.159 "superblock": true, 00:21:32.159 "num_base_bdevs": 2, 00:21:32.159 "num_base_bdevs_discovered": 1, 00:21:32.159 "num_base_bdevs_operational": 1, 00:21:32.159 "base_bdevs_list": [ 00:21:32.159 { 00:21:32.159 "name": null, 00:21:32.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.159 "is_configured": false, 00:21:32.159 "data_offset": 256, 00:21:32.159 "data_size": 7936 00:21:32.159 }, 00:21:32.159 { 00:21:32.159 "name": "BaseBdev2", 00:21:32.159 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:32.159 "is_configured": true, 00:21:32.159 "data_offset": 256, 00:21:32.159 "data_size": 7936 
00:21:32.159 } 00:21:32.159 ] 00:21:32.159 }' 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.159 23:43:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:32.725 "name": "raid_bdev1", 00:21:32.725 "uuid": "975f8426-3ecd-4388-9eca-f644e743c773", 00:21:32.725 "strip_size_kb": 0, 00:21:32.725 "state": "online", 00:21:32.725 "raid_level": "raid1", 00:21:32.725 "superblock": true, 00:21:32.725 "num_base_bdevs": 2, 00:21:32.725 "num_base_bdevs_discovered": 1, 00:21:32.725 "num_base_bdevs_operational": 1, 00:21:32.725 "base_bdevs_list": [ 00:21:32.725 { 00:21:32.725 "name": null, 00:21:32.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.725 "is_configured": false, 00:21:32.725 "data_offset": 256, 00:21:32.725 "data_size": 7936 00:21:32.725 }, 00:21:32.725 { 00:21:32.725 "name": "BaseBdev2", 00:21:32.725 "uuid": "db527a73-a852-55b0-9fd9-310b6094c567", 00:21:32.725 
"is_configured": true, 00:21:32.725 "data_offset": 256, 00:21:32.725 "data_size": 7936 00:21:32.725 } 00:21:32.725 ] 00:21:32.725 }' 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:32.725 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 382284 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 382284 ']' 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 382284 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 382284 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 382284' 00:21:32.984 killing process with pid 382284 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@969 -- # kill 382284 00:21:32.984 Received shutdown signal, test time was about 60.000000 seconds 00:21:32.984 00:21:32.984 Latency(us) 00:21:32.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:32.984 
=================================================================================================================== 00:21:32.984 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:32.984 [2024-07-24 23:43:17.784413] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:32.984 [2024-07-24 23:43:17.784489] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:32.984 [2024-07-24 23:43:17.784522] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:32.984 [2024-07-24 23:43:17.784529] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13896e0 name raid_bdev1, state offline 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@974 -- # wait 382284 00:21:32.984 [2024-07-24 23:43:17.807753] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:21:32.984 00:21:32.984 real 0m25.503s 00:21:32.984 user 0m39.180s 00:21:32.984 sys 0m3.225s 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:32.984 23:43:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:32.984 ************************************ 00:21:32.984 END TEST raid_rebuild_test_sb_4k 00:21:32.984 ************************************ 00:21:33.243 23:43:18 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:21:33.243 23:43:18 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:21:33.243 23:43:18 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:33.243 23:43:18 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:33.243 23:43:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:33.243 
************************************ 00:21:33.243 START TEST raid_state_function_test_sb_md_separate 00:21:33.243 ************************************ 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:33.243 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local 
base_bdevs 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=386846 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 386846' 00:21:33.244 Process raid pid: 386846 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 386846 /var/tmp/spdk-raid.sock 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 386846 ']' 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:33.244 23:43:18 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:33.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:33.244 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:33.244 [2024-07-24 23:43:18.091494] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:21:33.244 [2024-07-24 23:43:18.091530] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:33.244 [2024-07-24 23:43:18.154130] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:33.244 [2024-07-24 23:43:18.232525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:33.502 [2024-07-24 23:43:18.281711] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:33.502 [2024-07-24 23:43:18.281734] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:34.069 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:34.069 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:21:34.069 23:43:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:21:34.069 [2024-07-24 23:43:19.040405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:34.069 [2024-07-24 23:43:19.040437] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:34.069 [2024-07-24 23:43:19.040442] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:34.069 [2024-07-24 23:43:19.040448] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.069 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:34.328 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.328 "name": "Existed_Raid", 00:21:34.328 "uuid": "a1213edf-fc0e-45bb-a706-07d324b3f67b", 00:21:34.328 "strip_size_kb": 0, 00:21:34.328 "state": "configuring", 00:21:34.328 "raid_level": "raid1", 00:21:34.328 "superblock": true, 00:21:34.328 "num_base_bdevs": 2, 00:21:34.328 "num_base_bdevs_discovered": 0, 00:21:34.328 "num_base_bdevs_operational": 2, 00:21:34.328 "base_bdevs_list": [ 00:21:34.328 { 00:21:34.328 "name": "BaseBdev1", 00:21:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:34.328 "is_configured": false, 00:21:34.328 "data_offset": 0, 00:21:34.328 "data_size": 0 00:21:34.328 }, 00:21:34.328 { 00:21:34.328 "name": "BaseBdev2", 00:21:34.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:34.328 "is_configured": false, 00:21:34.328 "data_offset": 0, 00:21:34.328 "data_size": 0 00:21:34.328 } 00:21:34.328 ] 00:21:34.328 }' 00:21:34.328 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.328 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:34.895 23:43:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:34.895 [2024-07-24 23:43:19.814316] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:34.895 [2024-07-24 23:43:19.814335] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x159eb10 name Existed_Raid, state configuring 00:21:34.896 23:43:19 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:35.154 [2024-07-24 23:43:19.982772] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:35.154 [2024-07-24 23:43:19.982788] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:35.154 [2024-07-24 23:43:19.982792] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:35.154 [2024-07-24 23:43:19.982797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:35.154 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:21:35.413 [2024-07-24 23:43:20.168133] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:35.413 BaseBdev1 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:35.413 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:35.672 [ 00:21:35.672 { 00:21:35.672 "name": "BaseBdev1", 00:21:35.672 "aliases": [ 00:21:35.672 "26d96377-afdd-47b8-a13a-d0fa81e673e9" 00:21:35.672 ], 00:21:35.672 "product_name": "Malloc disk", 00:21:35.672 "block_size": 4096, 00:21:35.672 "num_blocks": 8192, 00:21:35.672 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:35.672 "md_size": 32, 00:21:35.672 "md_interleave": false, 00:21:35.672 "dif_type": 0, 00:21:35.672 "assigned_rate_limits": { 00:21:35.672 "rw_ios_per_sec": 0, 00:21:35.672 "rw_mbytes_per_sec": 0, 00:21:35.672 "r_mbytes_per_sec": 0, 00:21:35.672 "w_mbytes_per_sec": 0 00:21:35.672 }, 00:21:35.672 "claimed": true, 00:21:35.672 "claim_type": "exclusive_write", 00:21:35.672 "zoned": false, 00:21:35.672 "supported_io_types": { 00:21:35.672 "read": true, 00:21:35.672 "write": true, 00:21:35.672 "unmap": true, 00:21:35.672 "flush": true, 00:21:35.672 "reset": true, 00:21:35.672 "nvme_admin": false, 00:21:35.672 "nvme_io": false, 00:21:35.672 "nvme_io_md": false, 00:21:35.672 "write_zeroes": true, 00:21:35.672 "zcopy": true, 00:21:35.672 "get_zone_info": false, 00:21:35.672 "zone_management": false, 00:21:35.672 "zone_append": false, 00:21:35.672 "compare": false, 00:21:35.672 "compare_and_write": false, 00:21:35.672 "abort": true, 00:21:35.672 "seek_hole": false, 00:21:35.672 "seek_data": false, 00:21:35.672 "copy": true, 00:21:35.672 "nvme_iov_md": false 00:21:35.672 }, 00:21:35.672 "memory_domains": [ 00:21:35.672 { 00:21:35.672 "dma_device_id": "system", 00:21:35.672 "dma_device_type": 1 00:21:35.672 }, 00:21:35.672 { 00:21:35.672 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:21:35.672 "dma_device_type": 2 00:21:35.672 } 00:21:35.672 ], 00:21:35.672 "driver_specific": {} 00:21:35.672 } 00:21:35.672 ] 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.672 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:35.931 23:43:20 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.931 "name": "Existed_Raid", 00:21:35.931 "uuid": "00ba64b4-2a1d-455b-8e3d-2c8e4812f35f", 00:21:35.931 "strip_size_kb": 0, 00:21:35.931 "state": "configuring", 00:21:35.931 "raid_level": "raid1", 00:21:35.931 "superblock": true, 00:21:35.931 "num_base_bdevs": 2, 00:21:35.931 "num_base_bdevs_discovered": 1, 00:21:35.931 "num_base_bdevs_operational": 2, 00:21:35.931 "base_bdevs_list": [ 00:21:35.931 { 00:21:35.931 "name": "BaseBdev1", 00:21:35.931 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:35.931 "is_configured": true, 00:21:35.931 "data_offset": 256, 00:21:35.931 "data_size": 7936 00:21:35.931 }, 00:21:35.931 { 00:21:35.931 "name": "BaseBdev2", 00:21:35.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.931 "is_configured": false, 00:21:35.931 "data_offset": 0, 00:21:35.931 "data_size": 0 00:21:35.931 } 00:21:35.931 ] 00:21:35.931 }' 00:21:35.931 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.931 23:43:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:36.189 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:36.448 [2024-07-24 23:43:21.271005] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:36.448 [2024-07-24 23:43:21.271034] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x159e3a0 name Existed_Raid, state configuring 00:21:36.448 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:36.448 [2024-07-24 
23:43:21.439467] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:36.448 [2024-07-24 23:43:21.440532] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:36.448 [2024-07-24 23:43:21.440556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.707 "name": "Existed_Raid", 00:21:36.707 "uuid": "94607ddf-9c06-494d-8363-5afeb61ce494", 00:21:36.707 "strip_size_kb": 0, 00:21:36.707 "state": "configuring", 00:21:36.707 "raid_level": "raid1", 00:21:36.707 "superblock": true, 00:21:36.707 "num_base_bdevs": 2, 00:21:36.707 "num_base_bdevs_discovered": 1, 00:21:36.707 "num_base_bdevs_operational": 2, 00:21:36.707 "base_bdevs_list": [ 00:21:36.707 { 00:21:36.707 "name": "BaseBdev1", 00:21:36.707 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:36.707 "is_configured": true, 00:21:36.707 "data_offset": 256, 00:21:36.707 "data_size": 7936 00:21:36.707 }, 00:21:36.707 { 00:21:36.707 "name": "BaseBdev2", 00:21:36.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.707 "is_configured": false, 00:21:36.707 "data_offset": 0, 00:21:36.707 "data_size": 0 00:21:36.707 } 00:21:36.707 ] 00:21:36.707 }' 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.707 23:43:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:37.276 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:21:37.535 [2024-07-24 23:43:22.296960] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:37.535 [2024-07-24 23:43:22.297067] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 
0x159dba0 00:21:37.535 [2024-07-24 23:43:22.297075] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:37.535 [2024-07-24 23:43:22.297119] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x159d5e0 00:21:37.535 [2024-07-24 23:43:22.297186] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x159dba0 00:21:37.535 [2024-07-24 23:43:22.297191] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x159dba0 00:21:37.535 [2024-07-24 23:43:22.297234] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:37.535 BaseBdev2 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.535 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:37.794 [ 00:21:37.794 { 00:21:37.794 "name": "BaseBdev2", 00:21:37.794 "aliases": [ 00:21:37.794 
"a171153f-8c24-47ec-a2c6-2fb6a613a4b3" 00:21:37.794 ], 00:21:37.794 "product_name": "Malloc disk", 00:21:37.794 "block_size": 4096, 00:21:37.794 "num_blocks": 8192, 00:21:37.794 "uuid": "a171153f-8c24-47ec-a2c6-2fb6a613a4b3", 00:21:37.794 "md_size": 32, 00:21:37.794 "md_interleave": false, 00:21:37.794 "dif_type": 0, 00:21:37.794 "assigned_rate_limits": { 00:21:37.794 "rw_ios_per_sec": 0, 00:21:37.794 "rw_mbytes_per_sec": 0, 00:21:37.794 "r_mbytes_per_sec": 0, 00:21:37.794 "w_mbytes_per_sec": 0 00:21:37.794 }, 00:21:37.794 "claimed": true, 00:21:37.794 "claim_type": "exclusive_write", 00:21:37.794 "zoned": false, 00:21:37.794 "supported_io_types": { 00:21:37.794 "read": true, 00:21:37.794 "write": true, 00:21:37.794 "unmap": true, 00:21:37.794 "flush": true, 00:21:37.794 "reset": true, 00:21:37.794 "nvme_admin": false, 00:21:37.794 "nvme_io": false, 00:21:37.794 "nvme_io_md": false, 00:21:37.794 "write_zeroes": true, 00:21:37.794 "zcopy": true, 00:21:37.794 "get_zone_info": false, 00:21:37.794 "zone_management": false, 00:21:37.794 "zone_append": false, 00:21:37.794 "compare": false, 00:21:37.794 "compare_and_write": false, 00:21:37.794 "abort": true, 00:21:37.794 "seek_hole": false, 00:21:37.794 "seek_data": false, 00:21:37.794 "copy": true, 00:21:37.794 "nvme_iov_md": false 00:21:37.794 }, 00:21:37.794 "memory_domains": [ 00:21:37.794 { 00:21:37.794 "dma_device_id": "system", 00:21:37.794 "dma_device_type": 1 00:21:37.794 }, 00:21:37.794 { 00:21:37.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.794 "dma_device_type": 2 00:21:37.794 } 00:21:37.794 ], 00:21:37.794 "driver_specific": {} 00:21:37.794 } 00:21:37.794 ] 00:21:37.794 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.795 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.053 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.053 "name": "Existed_Raid", 00:21:38.053 "uuid": "94607ddf-9c06-494d-8363-5afeb61ce494", 00:21:38.053 "strip_size_kb": 0, 00:21:38.053 "state": "online", 00:21:38.053 "raid_level": "raid1", 
00:21:38.053 "superblock": true, 00:21:38.053 "num_base_bdevs": 2, 00:21:38.053 "num_base_bdevs_discovered": 2, 00:21:38.053 "num_base_bdevs_operational": 2, 00:21:38.053 "base_bdevs_list": [ 00:21:38.053 { 00:21:38.053 "name": "BaseBdev1", 00:21:38.053 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:38.053 "is_configured": true, 00:21:38.053 "data_offset": 256, 00:21:38.053 "data_size": 7936 00:21:38.053 }, 00:21:38.053 { 00:21:38.053 "name": "BaseBdev2", 00:21:38.053 "uuid": "a171153f-8c24-47ec-a2c6-2fb6a613a4b3", 00:21:38.053 "is_configured": true, 00:21:38.053 "data_offset": 256, 00:21:38.054 "data_size": 7936 00:21:38.054 } 00:21:38.054 ] 00:21:38.054 }' 00:21:38.054 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.054 23:43:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:38.312 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq 
'.[]' 00:21:38.570 [2024-07-24 23:43:23.424075] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:38.570 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:38.570 "name": "Existed_Raid", 00:21:38.570 "aliases": [ 00:21:38.570 "94607ddf-9c06-494d-8363-5afeb61ce494" 00:21:38.570 ], 00:21:38.570 "product_name": "Raid Volume", 00:21:38.570 "block_size": 4096, 00:21:38.571 "num_blocks": 7936, 00:21:38.571 "uuid": "94607ddf-9c06-494d-8363-5afeb61ce494", 00:21:38.571 "md_size": 32, 00:21:38.571 "md_interleave": false, 00:21:38.571 "dif_type": 0, 00:21:38.571 "assigned_rate_limits": { 00:21:38.571 "rw_ios_per_sec": 0, 00:21:38.571 "rw_mbytes_per_sec": 0, 00:21:38.571 "r_mbytes_per_sec": 0, 00:21:38.571 "w_mbytes_per_sec": 0 00:21:38.571 }, 00:21:38.571 "claimed": false, 00:21:38.571 "zoned": false, 00:21:38.571 "supported_io_types": { 00:21:38.571 "read": true, 00:21:38.571 "write": true, 00:21:38.571 "unmap": false, 00:21:38.571 "flush": false, 00:21:38.571 "reset": true, 00:21:38.571 "nvme_admin": false, 00:21:38.571 "nvme_io": false, 00:21:38.571 "nvme_io_md": false, 00:21:38.571 "write_zeroes": true, 00:21:38.571 "zcopy": false, 00:21:38.571 "get_zone_info": false, 00:21:38.571 "zone_management": false, 00:21:38.571 "zone_append": false, 00:21:38.571 "compare": false, 00:21:38.571 "compare_and_write": false, 00:21:38.571 "abort": false, 00:21:38.571 "seek_hole": false, 00:21:38.571 "seek_data": false, 00:21:38.571 "copy": false, 00:21:38.571 "nvme_iov_md": false 00:21:38.571 }, 00:21:38.571 "memory_domains": [ 00:21:38.571 { 00:21:38.571 "dma_device_id": "system", 00:21:38.571 "dma_device_type": 1 00:21:38.571 }, 00:21:38.571 { 00:21:38.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.571 "dma_device_type": 2 00:21:38.571 }, 00:21:38.571 { 00:21:38.571 "dma_device_id": "system", 00:21:38.571 "dma_device_type": 1 00:21:38.571 }, 00:21:38.571 { 00:21:38.571 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.571 "dma_device_type": 2 00:21:38.571 } 00:21:38.571 ], 00:21:38.571 "driver_specific": { 00:21:38.571 "raid": { 00:21:38.571 "uuid": "94607ddf-9c06-494d-8363-5afeb61ce494", 00:21:38.571 "strip_size_kb": 0, 00:21:38.571 "state": "online", 00:21:38.571 "raid_level": "raid1", 00:21:38.571 "superblock": true, 00:21:38.571 "num_base_bdevs": 2, 00:21:38.571 "num_base_bdevs_discovered": 2, 00:21:38.571 "num_base_bdevs_operational": 2, 00:21:38.571 "base_bdevs_list": [ 00:21:38.571 { 00:21:38.571 "name": "BaseBdev1", 00:21:38.571 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:38.571 "is_configured": true, 00:21:38.571 "data_offset": 256, 00:21:38.571 "data_size": 7936 00:21:38.571 }, 00:21:38.571 { 00:21:38.571 "name": "BaseBdev2", 00:21:38.571 "uuid": "a171153f-8c24-47ec-a2c6-2fb6a613a4b3", 00:21:38.571 "is_configured": true, 00:21:38.571 "data_offset": 256, 00:21:38.571 "data_size": 7936 00:21:38.571 } 00:21:38.571 ] 00:21:38.571 } 00:21:38.571 } 00:21:38.571 }' 00:21:38.571 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:38.571 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:38.571 BaseBdev2' 00:21:38.571 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.571 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:38.571 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:38.828 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:38.828 "name": "BaseBdev1", 00:21:38.828 
"aliases": [ 00:21:38.828 "26d96377-afdd-47b8-a13a-d0fa81e673e9" 00:21:38.828 ], 00:21:38.828 "product_name": "Malloc disk", 00:21:38.828 "block_size": 4096, 00:21:38.828 "num_blocks": 8192, 00:21:38.828 "uuid": "26d96377-afdd-47b8-a13a-d0fa81e673e9", 00:21:38.828 "md_size": 32, 00:21:38.828 "md_interleave": false, 00:21:38.828 "dif_type": 0, 00:21:38.828 "assigned_rate_limits": { 00:21:38.828 "rw_ios_per_sec": 0, 00:21:38.828 "rw_mbytes_per_sec": 0, 00:21:38.828 "r_mbytes_per_sec": 0, 00:21:38.828 "w_mbytes_per_sec": 0 00:21:38.828 }, 00:21:38.828 "claimed": true, 00:21:38.828 "claim_type": "exclusive_write", 00:21:38.828 "zoned": false, 00:21:38.828 "supported_io_types": { 00:21:38.828 "read": true, 00:21:38.828 "write": true, 00:21:38.828 "unmap": true, 00:21:38.828 "flush": true, 00:21:38.828 "reset": true, 00:21:38.828 "nvme_admin": false, 00:21:38.828 "nvme_io": false, 00:21:38.828 "nvme_io_md": false, 00:21:38.828 "write_zeroes": true, 00:21:38.828 "zcopy": true, 00:21:38.828 "get_zone_info": false, 00:21:38.828 "zone_management": false, 00:21:38.828 "zone_append": false, 00:21:38.828 "compare": false, 00:21:38.828 "compare_and_write": false, 00:21:38.828 "abort": true, 00:21:38.828 "seek_hole": false, 00:21:38.828 "seek_data": false, 00:21:38.828 "copy": true, 00:21:38.828 "nvme_iov_md": false 00:21:38.829 }, 00:21:38.829 "memory_domains": [ 00:21:38.829 { 00:21:38.829 "dma_device_id": "system", 00:21:38.829 "dma_device_type": 1 00:21:38.829 }, 00:21:38.829 { 00:21:38.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.829 "dma_device_type": 2 00:21:38.829 } 00:21:38.829 ], 00:21:38.829 "driver_specific": {} 00:21:38.829 }' 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # 
[[ 4096 == 4096 ]] 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:38.829 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:39.086 23:43:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:39.344 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:39.344 "name": "BaseBdev2", 00:21:39.344 "aliases": [ 00:21:39.344 "a171153f-8c24-47ec-a2c6-2fb6a613a4b3" 00:21:39.344 ], 00:21:39.344 "product_name": "Malloc disk", 00:21:39.344 "block_size": 4096, 00:21:39.344 "num_blocks": 8192, 00:21:39.344 "uuid": "a171153f-8c24-47ec-a2c6-2fb6a613a4b3", 00:21:39.344 "md_size": 32, 00:21:39.344 
"md_interleave": false, 00:21:39.344 "dif_type": 0, 00:21:39.344 "assigned_rate_limits": { 00:21:39.344 "rw_ios_per_sec": 0, 00:21:39.344 "rw_mbytes_per_sec": 0, 00:21:39.344 "r_mbytes_per_sec": 0, 00:21:39.344 "w_mbytes_per_sec": 0 00:21:39.344 }, 00:21:39.344 "claimed": true, 00:21:39.344 "claim_type": "exclusive_write", 00:21:39.344 "zoned": false, 00:21:39.344 "supported_io_types": { 00:21:39.344 "read": true, 00:21:39.344 "write": true, 00:21:39.344 "unmap": true, 00:21:39.344 "flush": true, 00:21:39.344 "reset": true, 00:21:39.344 "nvme_admin": false, 00:21:39.344 "nvme_io": false, 00:21:39.344 "nvme_io_md": false, 00:21:39.344 "write_zeroes": true, 00:21:39.345 "zcopy": true, 00:21:39.345 "get_zone_info": false, 00:21:39.345 "zone_management": false, 00:21:39.345 "zone_append": false, 00:21:39.345 "compare": false, 00:21:39.345 "compare_and_write": false, 00:21:39.345 "abort": true, 00:21:39.345 "seek_hole": false, 00:21:39.345 "seek_data": false, 00:21:39.345 "copy": true, 00:21:39.345 "nvme_iov_md": false 00:21:39.345 }, 00:21:39.345 "memory_domains": [ 00:21:39.345 { 00:21:39.345 "dma_device_id": "system", 00:21:39.345 "dma_device_type": 1 00:21:39.345 }, 00:21:39.345 { 00:21:39.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:39.345 "dma_device_type": 2 00:21:39.345 } 00:21:39.345 ], 00:21:39.345 "driver_specific": {} 00:21:39.345 }' 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.345 23:43:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.345 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.604 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:39.604 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.604 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.604 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:39.604 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:39.604 [2024-07-24 23:43:24.590930] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.863 "name": "Existed_Raid", 00:21:39.863 "uuid": "94607ddf-9c06-494d-8363-5afeb61ce494", 00:21:39.863 "strip_size_kb": 0, 00:21:39.863 "state": "online", 00:21:39.863 "raid_level": "raid1", 00:21:39.863 "superblock": true, 00:21:39.863 "num_base_bdevs": 2, 00:21:39.863 "num_base_bdevs_discovered": 1, 00:21:39.863 "num_base_bdevs_operational": 1, 00:21:39.863 "base_bdevs_list": [ 00:21:39.863 { 00:21:39.863 "name": null, 00:21:39.863 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:39.863 "is_configured": false, 00:21:39.863 "data_offset": 256, 00:21:39.863 "data_size": 7936 00:21:39.863 }, 00:21:39.863 { 00:21:39.863 "name": "BaseBdev2", 00:21:39.863 "uuid": "a171153f-8c24-47ec-a2c6-2fb6a613a4b3", 00:21:39.863 "is_configured": true, 00:21:39.863 "data_offset": 256, 00:21:39.863 "data_size": 7936 00:21:39.863 } 00:21:39.863 ] 00:21:39.863 }' 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.863 23:43:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:40.429 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:40.689 [2024-07-24 23:43:25.559272] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:40.689 [2024-07-24 23:43:25.559335] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:40.689 [2024-07-24 
23:43:25.569994] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:40.689 [2024-07-24 23:43:25.570018] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:40.689 [2024-07-24 23:43:25.570024] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x159dba0 name Existed_Raid, state offline 00:21:40.689 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:40.689 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:40.689 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.689 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 386846 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 386846 ']' 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 386846 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:40.948 23:43:25 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 386846 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 386846' 00:21:40.948 killing process with pid 386846 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 386846 00:21:40.948 [2024-07-24 23:43:25.782184] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:40.948 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 386846 00:21:40.948 [2024-07-24 23:43:25.782962] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:41.207 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:21:41.207 00:21:41.207 real 0m7.922s 00:21:41.207 user 0m14.207s 00:21:41.207 sys 0m1.311s 00:21:41.207 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:41.207 23:43:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:41.207 ************************************ 00:21:41.207 END TEST raid_state_function_test_sb_md_separate 00:21:41.207 ************************************ 00:21:41.207 23:43:25 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:21:41.207 23:43:25 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:21:41.207 23:43:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:41.207 23:43:25 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:21:41.207 ************************************ 00:21:41.207 START TEST raid_superblock_test_md_separate 00:21:41.207 ************************************ 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:41.207 23:43:26 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=388434 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 388434 /var/tmp/spdk-raid.sock 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # '[' -z 388434 ']' 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:41.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:41.207 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:41.207 [2024-07-24 23:43:26.060623] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:21:41.207 [2024-07-24 23:43:26.060657] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid388434 ] 00:21:41.207 [2024-07-24 23:43:26.124876] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:41.207 [2024-07-24 23:43:26.203060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:41.466 [2024-07-24 23:43:26.256287] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:41.466 [2024-07-24 23:43:26.256313] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@864 -- # return 0 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:42.033 23:43:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:42.033 23:43:26 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:21:42.033 malloc1 00:21:42.033 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:42.292 [2024-07-24 23:43:27.163995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:42.292 [2024-07-24 23:43:27.164032] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.292 [2024-07-24 23:43:27.164050] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ce120 00:21:42.292 [2024-07-24 23:43:27.164056] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.292 [2024-07-24 23:43:27.165078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.292 [2024-07-24 23:43:27.165097] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:42.292 pt1 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:42.292 23:43:27 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:42.292 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:21:42.551 malloc2 00:21:42.551 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:42.551 [2024-07-24 23:43:27.481212] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:42.551 [2024-07-24 23:43:27.481244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.551 [2024-07-24 23:43:27.481258] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ccd30 00:21:42.551 [2024-07-24 23:43:27.481264] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.551 [2024-07-24 23:43:27.482184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.551 [2024-07-24 23:43:27.482203] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:42.551 pt2 00:21:42.551 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:42.551 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:42.551 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:42.810 [2024-07-24 23:43:27.633625] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:42.810 [2024-07-24 23:43:27.634433] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:42.810 [2024-07-24 23:43:27.634554] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x252a4b0 00:21:42.810 [2024-07-24 23:43:27.634563] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:42.810 [2024-07-24 23:43:27.634610] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ccfc0 00:21:42.810 [2024-07-24 23:43:27.634703] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252a4b0 00:21:42.810 [2024-07-24 23:43:27.634709] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252a4b0 00:21:42.810 [2024-07-24 23:43:27.634754] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.810 23:43:27 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.810 "name": "raid_bdev1", 00:21:42.810 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:42.810 "strip_size_kb": 0, 00:21:42.810 "state": "online", 00:21:42.810 "raid_level": "raid1", 00:21:42.810 "superblock": true, 00:21:42.810 "num_base_bdevs": 2, 00:21:42.810 "num_base_bdevs_discovered": 2, 00:21:42.810 "num_base_bdevs_operational": 2, 00:21:42.810 "base_bdevs_list": [ 00:21:42.810 { 00:21:42.810 "name": "pt1", 00:21:42.810 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:42.810 "is_configured": true, 00:21:42.810 "data_offset": 256, 00:21:42.810 "data_size": 7936 00:21:42.810 }, 00:21:42.810 { 00:21:42.810 "name": "pt2", 00:21:42.810 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:42.810 "is_configured": true, 00:21:42.810 "data_offset": 256, 00:21:42.810 "data_size": 7936 00:21:42.810 } 00:21:42.810 ] 00:21:42.810 }' 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.810 23:43:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:43.376 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:43.635 [2024-07-24 23:43:28.407784] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:43.635 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:43.635 "name": "raid_bdev1", 00:21:43.635 "aliases": [ 00:21:43.635 "38dd37a6-bdc7-4503-a3b6-d59d61851478" 00:21:43.635 ], 00:21:43.635 "product_name": "Raid Volume", 00:21:43.635 "block_size": 4096, 00:21:43.635 "num_blocks": 7936, 00:21:43.635 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:43.635 "md_size": 32, 00:21:43.635 "md_interleave": false, 00:21:43.635 "dif_type": 0, 00:21:43.635 "assigned_rate_limits": { 00:21:43.635 "rw_ios_per_sec": 0, 00:21:43.635 "rw_mbytes_per_sec": 0, 00:21:43.635 "r_mbytes_per_sec": 0, 00:21:43.635 "w_mbytes_per_sec": 0 00:21:43.635 }, 00:21:43.635 "claimed": false, 00:21:43.635 "zoned": false, 00:21:43.635 "supported_io_types": { 00:21:43.635 "read": true, 00:21:43.635 "write": true, 00:21:43.635 "unmap": false, 00:21:43.635 "flush": false, 00:21:43.635 "reset": true, 00:21:43.635 "nvme_admin": false, 00:21:43.635 "nvme_io": false, 00:21:43.635 "nvme_io_md": false, 00:21:43.635 "write_zeroes": true, 
00:21:43.635 "zcopy": false, 00:21:43.635 "get_zone_info": false, 00:21:43.635 "zone_management": false, 00:21:43.635 "zone_append": false, 00:21:43.635 "compare": false, 00:21:43.635 "compare_and_write": false, 00:21:43.635 "abort": false, 00:21:43.635 "seek_hole": false, 00:21:43.635 "seek_data": false, 00:21:43.635 "copy": false, 00:21:43.635 "nvme_iov_md": false 00:21:43.635 }, 00:21:43.635 "memory_domains": [ 00:21:43.635 { 00:21:43.635 "dma_device_id": "system", 00:21:43.635 "dma_device_type": 1 00:21:43.635 }, 00:21:43.635 { 00:21:43.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.635 "dma_device_type": 2 00:21:43.635 }, 00:21:43.636 { 00:21:43.636 "dma_device_id": "system", 00:21:43.636 "dma_device_type": 1 00:21:43.636 }, 00:21:43.636 { 00:21:43.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.636 "dma_device_type": 2 00:21:43.636 } 00:21:43.636 ], 00:21:43.636 "driver_specific": { 00:21:43.636 "raid": { 00:21:43.636 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:43.636 "strip_size_kb": 0, 00:21:43.636 "state": "online", 00:21:43.636 "raid_level": "raid1", 00:21:43.636 "superblock": true, 00:21:43.636 "num_base_bdevs": 2, 00:21:43.636 "num_base_bdevs_discovered": 2, 00:21:43.636 "num_base_bdevs_operational": 2, 00:21:43.636 "base_bdevs_list": [ 00:21:43.636 { 00:21:43.636 "name": "pt1", 00:21:43.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:43.636 "is_configured": true, 00:21:43.636 "data_offset": 256, 00:21:43.636 "data_size": 7936 00:21:43.636 }, 00:21:43.636 { 00:21:43.636 "name": "pt2", 00:21:43.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:43.636 "is_configured": true, 00:21:43.636 "data_offset": 256, 00:21:43.636 "data_size": 7936 00:21:43.636 } 00:21:43.636 ] 00:21:43.636 } 00:21:43.636 } 00:21:43.636 }' 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:43.636 23:43:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:43.636 pt2' 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:43.636 "name": "pt1", 00:21:43.636 "aliases": [ 00:21:43.636 "00000000-0000-0000-0000-000000000001" 00:21:43.636 ], 00:21:43.636 "product_name": "passthru", 00:21:43.636 "block_size": 4096, 00:21:43.636 "num_blocks": 8192, 00:21:43.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:43.636 "md_size": 32, 00:21:43.636 "md_interleave": false, 00:21:43.636 "dif_type": 0, 00:21:43.636 "assigned_rate_limits": { 00:21:43.636 "rw_ios_per_sec": 0, 00:21:43.636 "rw_mbytes_per_sec": 0, 00:21:43.636 "r_mbytes_per_sec": 0, 00:21:43.636 "w_mbytes_per_sec": 0 00:21:43.636 }, 00:21:43.636 "claimed": true, 00:21:43.636 "claim_type": "exclusive_write", 00:21:43.636 "zoned": false, 00:21:43.636 "supported_io_types": { 00:21:43.636 "read": true, 00:21:43.636 "write": true, 00:21:43.636 "unmap": true, 00:21:43.636 "flush": true, 00:21:43.636 "reset": true, 00:21:43.636 "nvme_admin": false, 00:21:43.636 "nvme_io": false, 00:21:43.636 "nvme_io_md": false, 00:21:43.636 "write_zeroes": true, 00:21:43.636 "zcopy": true, 00:21:43.636 "get_zone_info": false, 00:21:43.636 "zone_management": false, 00:21:43.636 "zone_append": false, 00:21:43.636 "compare": false, 00:21:43.636 "compare_and_write": false, 00:21:43.636 "abort": true, 00:21:43.636 "seek_hole": false, 00:21:43.636 "seek_data": false, 00:21:43.636 "copy": true, 00:21:43.636 
"nvme_iov_md": false 00:21:43.636 }, 00:21:43.636 "memory_domains": [ 00:21:43.636 { 00:21:43.636 "dma_device_id": "system", 00:21:43.636 "dma_device_type": 1 00:21:43.636 }, 00:21:43.636 { 00:21:43.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.636 "dma_device_type": 2 00:21:43.636 } 00:21:43.636 ], 00:21:43.636 "driver_specific": { 00:21:43.636 "passthru": { 00:21:43.636 "name": "pt1", 00:21:43.636 "base_bdev_name": "malloc1" 00:21:43.636 } 00:21:43.636 } 00:21:43.636 }' 00:21:43.636 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:43.894 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.152 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:44.152 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.153 23:43:28 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:44.153 23:43:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:44.153 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:44.153 "name": "pt2", 00:21:44.153 "aliases": [ 00:21:44.153 "00000000-0000-0000-0000-000000000002" 00:21:44.153 ], 00:21:44.153 "product_name": "passthru", 00:21:44.153 "block_size": 4096, 00:21:44.153 "num_blocks": 8192, 00:21:44.153 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:44.153 "md_size": 32, 00:21:44.153 "md_interleave": false, 00:21:44.153 "dif_type": 0, 00:21:44.153 "assigned_rate_limits": { 00:21:44.153 "rw_ios_per_sec": 0, 00:21:44.153 "rw_mbytes_per_sec": 0, 00:21:44.153 "r_mbytes_per_sec": 0, 00:21:44.153 "w_mbytes_per_sec": 0 00:21:44.153 }, 00:21:44.153 "claimed": true, 00:21:44.153 "claim_type": "exclusive_write", 00:21:44.153 "zoned": false, 00:21:44.153 "supported_io_types": { 00:21:44.153 "read": true, 00:21:44.153 "write": true, 00:21:44.153 "unmap": true, 00:21:44.153 "flush": true, 00:21:44.153 "reset": true, 00:21:44.153 "nvme_admin": false, 00:21:44.153 "nvme_io": false, 00:21:44.153 "nvme_io_md": false, 00:21:44.153 "write_zeroes": true, 00:21:44.153 "zcopy": true, 00:21:44.153 "get_zone_info": false, 00:21:44.153 "zone_management": false, 00:21:44.153 "zone_append": false, 00:21:44.153 "compare": false, 00:21:44.153 "compare_and_write": false, 00:21:44.153 "abort": true, 00:21:44.153 "seek_hole": false, 00:21:44.153 "seek_data": false, 00:21:44.153 "copy": true, 00:21:44.153 "nvme_iov_md": false 00:21:44.153 }, 00:21:44.153 "memory_domains": [ 00:21:44.153 { 00:21:44.153 "dma_device_id": "system", 00:21:44.153 "dma_device_type": 1 00:21:44.153 }, 00:21:44.153 { 00:21:44.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.153 "dma_device_type": 2 00:21:44.153 } 
00:21:44.153 ], 00:21:44.153 "driver_specific": { 00:21:44.153 "passthru": { 00:21:44.153 "name": "pt2", 00:21:44.153 "base_bdev_name": "malloc2" 00:21:44.153 } 00:21:44.153 } 00:21:44.153 }' 00:21:44.153 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.153 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:44.153 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:44.153 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:44.411 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:44.670 [2024-07-24 23:43:29.482572] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:44.670 23:43:29 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=38dd37a6-bdc7-4503-a3b6-d59d61851478 00:21:44.670 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 38dd37a6-bdc7-4503-a3b6-d59d61851478 ']' 00:21:44.670 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:44.670 [2024-07-24 23:43:29.646835] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:44.670 [2024-07-24 23:43:29.646850] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:44.670 [2024-07-24 23:43:29.646891] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:44.670 [2024-07-24 23:43:29.646927] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:44.670 [2024-07-24 23:43:29.646933] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252a4b0 name raid_bdev1, state offline 00:21:44.670 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.670 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:44.930 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:44.930 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:44.930 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:44.930 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:21:45.187 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:45.187 23:43:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:45.187 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:45.187 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:45.446 23:43:30 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:45.446 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:45.704 [2024-07-24 23:43:30.476959] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:45.704 [2024-07-24 23:43:30.477944] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:45.704 [2024-07-24 23:43:30.477984] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:45.704 [2024-07-24 23:43:30.478010] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:45.704 [2024-07-24 23:43:30.478019] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:45.704 [2024-07-24 23:43:30.478040] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2528720 name raid_bdev1, state configuring 00:21:45.704 request: 00:21:45.704 { 00:21:45.704 "name": "raid_bdev1", 00:21:45.704 "raid_level": "raid1", 00:21:45.704 "base_bdevs": [ 
00:21:45.704 "malloc1", 00:21:45.704 "malloc2" 00:21:45.704 ], 00:21:45.704 "superblock": false, 00:21:45.704 "method": "bdev_raid_create", 00:21:45.704 "req_id": 1 00:21:45.704 } 00:21:45.704 Got JSON-RPC error response 00:21:45.704 response: 00:21:45.704 { 00:21:45.704 "code": -17, 00:21:45.704 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:45.704 } 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # es=1 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:45.704 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:45.705 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:45.963 [2024-07-24 23:43:30.801780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:45.963 [2024-07-24 23:43:30.801816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:45.963 [2024-07-24 23:43:30.801846] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ce350 
00:21:45.963 [2024-07-24 23:43:30.801853] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:45.963 [2024-07-24 23:43:30.802928] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:45.963 [2024-07-24 23:43:30.802947] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:45.963 [2024-07-24 23:43:30.802979] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:45.963 [2024-07-24 23:43:30.802998] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:45.963 pt1 00:21:45.963 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:45.963 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.963 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.964 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:46.225 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.225 "name": "raid_bdev1", 00:21:46.225 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:46.225 "strip_size_kb": 0, 00:21:46.225 "state": "configuring", 00:21:46.225 "raid_level": "raid1", 00:21:46.225 "superblock": true, 00:21:46.225 "num_base_bdevs": 2, 00:21:46.225 "num_base_bdevs_discovered": 1, 00:21:46.225 "num_base_bdevs_operational": 2, 00:21:46.225 "base_bdevs_list": [ 00:21:46.225 { 00:21:46.225 "name": "pt1", 00:21:46.225 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:46.225 "is_configured": true, 00:21:46.225 "data_offset": 256, 00:21:46.225 "data_size": 7936 00:21:46.225 }, 00:21:46.225 { 00:21:46.225 "name": null, 00:21:46.225 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:46.225 "is_configured": false, 00:21:46.225 "data_offset": 256, 00:21:46.225 "data_size": 7936 00:21:46.225 } 00:21:46.225 ] 00:21:46.225 }' 00:21:46.225 23:43:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.225 23:43:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:46.517 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:21:46.517 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:46.517 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:46.517 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:46.777 [2024-07-24 23:43:31.623902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:46.777 [2024-07-24 23:43:31.623939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.777 [2024-07-24 23:43:31.623952] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252cb90 00:21:46.777 [2024-07-24 23:43:31.623959] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.777 [2024-07-24 23:43:31.624094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.777 [2024-07-24 23:43:31.624103] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:46.777 [2024-07-24 23:43:31.624131] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:46.777 [2024-07-24 23:43:31.624142] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:46.777 [2024-07-24 23:43:31.624202] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x252d0d0 00:21:46.777 [2024-07-24 23:43:31.624207] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:46.777 [2024-07-24 23:43:31.624248] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252e210 00:21:46.777 [2024-07-24 23:43:31.624316] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252d0d0 00:21:46.777 [2024-07-24 23:43:31.624321] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252d0d0 00:21:46.777 [2024-07-24 23:43:31.624365] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:46.777 pt2 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.777 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.036 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.036 "name": "raid_bdev1", 00:21:47.036 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:47.036 "strip_size_kb": 0, 00:21:47.036 "state": "online", 00:21:47.036 "raid_level": "raid1", 00:21:47.036 "superblock": true, 00:21:47.036 "num_base_bdevs": 2, 00:21:47.036 
"num_base_bdevs_discovered": 2, 00:21:47.036 "num_base_bdevs_operational": 2, 00:21:47.036 "base_bdevs_list": [ 00:21:47.036 { 00:21:47.036 "name": "pt1", 00:21:47.036 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:47.036 "is_configured": true, 00:21:47.036 "data_offset": 256, 00:21:47.036 "data_size": 7936 00:21:47.036 }, 00:21:47.036 { 00:21:47.036 "name": "pt2", 00:21:47.036 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:47.036 "is_configured": true, 00:21:47.036 "data_offset": 256, 00:21:47.036 "data_size": 7936 00:21:47.036 } 00:21:47.036 ] 00:21:47.036 }' 00:21:47.036 23:43:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.036 23:43:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:47.293 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:47.552 [2024-07-24 23:43:32.426143] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:47.552 23:43:32 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:47.552 "name": "raid_bdev1", 00:21:47.552 "aliases": [ 00:21:47.552 "38dd37a6-bdc7-4503-a3b6-d59d61851478" 00:21:47.552 ], 00:21:47.552 "product_name": "Raid Volume", 00:21:47.552 "block_size": 4096, 00:21:47.552 "num_blocks": 7936, 00:21:47.552 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:47.552 "md_size": 32, 00:21:47.552 "md_interleave": false, 00:21:47.552 "dif_type": 0, 00:21:47.552 "assigned_rate_limits": { 00:21:47.552 "rw_ios_per_sec": 0, 00:21:47.552 "rw_mbytes_per_sec": 0, 00:21:47.552 "r_mbytes_per_sec": 0, 00:21:47.552 "w_mbytes_per_sec": 0 00:21:47.552 }, 00:21:47.552 "claimed": false, 00:21:47.552 "zoned": false, 00:21:47.552 "supported_io_types": { 00:21:47.552 "read": true, 00:21:47.552 "write": true, 00:21:47.552 "unmap": false, 00:21:47.552 "flush": false, 00:21:47.552 "reset": true, 00:21:47.552 "nvme_admin": false, 00:21:47.552 "nvme_io": false, 00:21:47.552 "nvme_io_md": false, 00:21:47.552 "write_zeroes": true, 00:21:47.552 "zcopy": false, 00:21:47.552 "get_zone_info": false, 00:21:47.552 "zone_management": false, 00:21:47.552 "zone_append": false, 00:21:47.552 "compare": false, 00:21:47.552 "compare_and_write": false, 00:21:47.552 "abort": false, 00:21:47.552 "seek_hole": false, 00:21:47.552 "seek_data": false, 00:21:47.552 "copy": false, 00:21:47.552 "nvme_iov_md": false 00:21:47.552 }, 00:21:47.552 "memory_domains": [ 00:21:47.552 { 00:21:47.552 "dma_device_id": "system", 00:21:47.552 "dma_device_type": 1 00:21:47.552 }, 00:21:47.552 { 00:21:47.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.552 "dma_device_type": 2 00:21:47.552 }, 00:21:47.552 { 00:21:47.552 "dma_device_id": "system", 00:21:47.552 "dma_device_type": 1 00:21:47.552 }, 00:21:47.552 { 00:21:47.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.552 "dma_device_type": 2 00:21:47.552 } 00:21:47.552 ], 00:21:47.552 "driver_specific": { 00:21:47.552 "raid": { 
00:21:47.552 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:47.552 "strip_size_kb": 0, 00:21:47.552 "state": "online", 00:21:47.552 "raid_level": "raid1", 00:21:47.552 "superblock": true, 00:21:47.552 "num_base_bdevs": 2, 00:21:47.552 "num_base_bdevs_discovered": 2, 00:21:47.552 "num_base_bdevs_operational": 2, 00:21:47.552 "base_bdevs_list": [ 00:21:47.552 { 00:21:47.552 "name": "pt1", 00:21:47.552 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:47.552 "is_configured": true, 00:21:47.552 "data_offset": 256, 00:21:47.552 "data_size": 7936 00:21:47.552 }, 00:21:47.552 { 00:21:47.552 "name": "pt2", 00:21:47.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:47.552 "is_configured": true, 00:21:47.552 "data_offset": 256, 00:21:47.552 "data_size": 7936 00:21:47.552 } 00:21:47.552 ] 00:21:47.552 } 00:21:47.552 } 00:21:47.552 }' 00:21:47.552 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:47.552 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:47.552 pt2' 00:21:47.552 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:47.552 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:47.552 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.811 "name": "pt1", 00:21:47.811 "aliases": [ 00:21:47.811 "00000000-0000-0000-0000-000000000001" 00:21:47.811 ], 00:21:47.811 "product_name": "passthru", 00:21:47.811 "block_size": 4096, 00:21:47.811 "num_blocks": 8192, 00:21:47.811 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:47.811 "md_size": 32, 00:21:47.811 "md_interleave": false, 00:21:47.811 "dif_type": 0, 00:21:47.811 "assigned_rate_limits": { 00:21:47.811 "rw_ios_per_sec": 0, 00:21:47.811 "rw_mbytes_per_sec": 0, 00:21:47.811 "r_mbytes_per_sec": 0, 00:21:47.811 "w_mbytes_per_sec": 0 00:21:47.811 }, 00:21:47.811 "claimed": true, 00:21:47.811 "claim_type": "exclusive_write", 00:21:47.811 "zoned": false, 00:21:47.811 "supported_io_types": { 00:21:47.811 "read": true, 00:21:47.811 "write": true, 00:21:47.811 "unmap": true, 00:21:47.811 "flush": true, 00:21:47.811 "reset": true, 00:21:47.811 "nvme_admin": false, 00:21:47.811 "nvme_io": false, 00:21:47.811 "nvme_io_md": false, 00:21:47.811 "write_zeroes": true, 00:21:47.811 "zcopy": true, 00:21:47.811 "get_zone_info": false, 00:21:47.811 "zone_management": false, 00:21:47.811 "zone_append": false, 00:21:47.811 "compare": false, 00:21:47.811 "compare_and_write": false, 00:21:47.811 "abort": true, 00:21:47.811 "seek_hole": false, 00:21:47.811 "seek_data": false, 00:21:47.811 "copy": true, 00:21:47.811 "nvme_iov_md": false 00:21:47.811 }, 00:21:47.811 "memory_domains": [ 00:21:47.811 { 00:21:47.811 "dma_device_id": "system", 00:21:47.811 "dma_device_type": 1 00:21:47.811 }, 00:21:47.811 { 00:21:47.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.811 "dma_device_type": 2 00:21:47.811 } 00:21:47.811 ], 00:21:47.811 "driver_specific": { 00:21:47.811 "passthru": { 00:21:47.811 "name": "pt1", 00:21:47.811 "base_bdev_name": "malloc1" 00:21:47.811 } 00:21:47.811 } 00:21:47.811 }' 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:47.811 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:48.070 23:43:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:48.329 "name": "pt2", 00:21:48.329 "aliases": [ 00:21:48.329 "00000000-0000-0000-0000-000000000002" 00:21:48.329 ], 00:21:48.329 "product_name": "passthru", 00:21:48.329 "block_size": 4096, 00:21:48.329 "num_blocks": 8192, 00:21:48.329 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:48.329 "md_size": 32, 00:21:48.329 "md_interleave": false, 00:21:48.329 "dif_type": 0, 00:21:48.329 "assigned_rate_limits": { 00:21:48.329 "rw_ios_per_sec": 0, 00:21:48.329 "rw_mbytes_per_sec": 0, 00:21:48.329 "r_mbytes_per_sec": 0, 00:21:48.329 
"w_mbytes_per_sec": 0 00:21:48.329 }, 00:21:48.329 "claimed": true, 00:21:48.329 "claim_type": "exclusive_write", 00:21:48.329 "zoned": false, 00:21:48.329 "supported_io_types": { 00:21:48.329 "read": true, 00:21:48.329 "write": true, 00:21:48.329 "unmap": true, 00:21:48.329 "flush": true, 00:21:48.329 "reset": true, 00:21:48.329 "nvme_admin": false, 00:21:48.329 "nvme_io": false, 00:21:48.329 "nvme_io_md": false, 00:21:48.329 "write_zeroes": true, 00:21:48.329 "zcopy": true, 00:21:48.329 "get_zone_info": false, 00:21:48.329 "zone_management": false, 00:21:48.329 "zone_append": false, 00:21:48.329 "compare": false, 00:21:48.329 "compare_and_write": false, 00:21:48.329 "abort": true, 00:21:48.329 "seek_hole": false, 00:21:48.329 "seek_data": false, 00:21:48.329 "copy": true, 00:21:48.329 "nvme_iov_md": false 00:21:48.329 }, 00:21:48.329 "memory_domains": [ 00:21:48.329 { 00:21:48.329 "dma_device_id": "system", 00:21:48.329 "dma_device_type": 1 00:21:48.329 }, 00:21:48.329 { 00:21:48.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.329 "dma_device_type": 2 00:21:48.329 } 00:21:48.329 ], 00:21:48.329 "driver_specific": { 00:21:48.329 "passthru": { 00:21:48.329 "name": "pt2", 00:21:48.329 "base_bdev_name": "malloc2" 00:21:48.329 } 00:21:48.329 } 00:21:48.329 }' 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:21:48.329 23:43:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.329 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:48.594 [2024-07-24 23:43:33.577118] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 38dd37a6-bdc7-4503-a3b6-d59d61851478 '!=' 38dd37a6-bdc7-4503-a3b6-d59d61851478 ']' 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:48.594 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:48.855 [2024-07-24 23:43:33.745405] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.855 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.114 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.114 "name": "raid_bdev1", 00:21:49.114 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:49.114 "strip_size_kb": 0, 00:21:49.114 "state": "online", 00:21:49.114 "raid_level": "raid1", 00:21:49.114 "superblock": true, 00:21:49.114 "num_base_bdevs": 2, 00:21:49.114 "num_base_bdevs_discovered": 1, 00:21:49.114 "num_base_bdevs_operational": 1, 00:21:49.114 
"base_bdevs_list": [ 00:21:49.114 { 00:21:49.114 "name": null, 00:21:49.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.114 "is_configured": false, 00:21:49.114 "data_offset": 256, 00:21:49.114 "data_size": 7936 00:21:49.114 }, 00:21:49.114 { 00:21:49.114 "name": "pt2", 00:21:49.114 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:49.114 "is_configured": true, 00:21:49.114 "data_offset": 256, 00:21:49.114 "data_size": 7936 00:21:49.114 } 00:21:49.114 ] 00:21:49.114 }' 00:21:49.114 23:43:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.114 23:43:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:49.681 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:49.681 [2024-07-24 23:43:34.587577] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:49.681 [2024-07-24 23:43:34.587595] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:49.681 [2024-07-24 23:43:34.587633] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:49.681 [2024-07-24 23:43:34.587665] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:49.682 [2024-07-24 23:43:34.587670] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252d0d0 name raid_bdev1, state offline 00:21:49.682 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.682 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:49.940 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:21:49.940 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:49.940 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:49.940 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:49.940 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:21:50.200 23:43:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:50.200 [2024-07-24 23:43:35.100895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:50.200 [2024-07-24 23:43:35.100928] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.200 [2024-07-24 23:43:35.100939] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252b2a0 00:21:50.200 [2024-07-24 23:43:35.100961] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.200 [2024-07-24 23:43:35.102019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.200 
[2024-07-24 23:43:35.102039] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:50.200 [2024-07-24 23:43:35.102070] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:50.200 [2024-07-24 23:43:35.102090] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:50.200 [2024-07-24 23:43:35.102147] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x252dc20 00:21:50.200 [2024-07-24 23:43:35.102152] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:50.200 [2024-07-24 23:43:35.102199] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252d8e0 00:21:50.200 [2024-07-24 23:43:35.102264] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252dc20 00:21:50.200 [2024-07-24 23:43:35.102269] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252dc20 00:21:50.200 [2024-07-24 23:43:35.102313] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.200 pt2 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.200 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.459 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.459 "name": "raid_bdev1", 00:21:50.459 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:50.459 "strip_size_kb": 0, 00:21:50.459 "state": "online", 00:21:50.459 "raid_level": "raid1", 00:21:50.459 "superblock": true, 00:21:50.459 "num_base_bdevs": 2, 00:21:50.459 "num_base_bdevs_discovered": 1, 00:21:50.459 "num_base_bdevs_operational": 1, 00:21:50.459 "base_bdevs_list": [ 00:21:50.459 { 00:21:50.459 "name": null, 00:21:50.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.459 "is_configured": false, 00:21:50.459 "data_offset": 256, 00:21:50.459 "data_size": 7936 00:21:50.459 }, 00:21:50.459 { 00:21:50.459 "name": "pt2", 00:21:50.459 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:50.459 "is_configured": true, 00:21:50.459 "data_offset": 256, 00:21:50.459 "data_size": 7936 00:21:50.459 } 00:21:50.459 ] 00:21:50.459 }' 00:21:50.459 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.459 23:43:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:51.027 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:51.027 [2024-07-24 23:43:35.882918] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:51.027 [2024-07-24 23:43:35.882934] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:51.027 [2024-07-24 23:43:35.882970] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:51.027 [2024-07-24 23:43:35.883001] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:51.027 [2024-07-24 23:43:35.883007] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252dc20 name raid_bdev1, state offline 00:21:51.027 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.027 23:43:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:51.286 [2024-07-24 23:43:36.211774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:51.286 [2024-07-24 23:43:36.211810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.286 [2024-07-24 23:43:36.211823] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2529400 00:21:51.286 [2024-07-24 23:43:36.211829] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.286 [2024-07-24 23:43:36.212889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.286 [2024-07-24 23:43:36.212909] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:51.286 [2024-07-24 23:43:36.212941] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:51.286 [2024-07-24 23:43:36.212962] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:51.286 [2024-07-24 23:43:36.213022] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:51.286 [2024-07-24 23:43:36.213029] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:51.286 [2024-07-24 23:43:36.213038] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25308b0 name raid_bdev1, state configuring 00:21:51.286 [2024-07-24 23:43:36.213052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:51.286 [2024-07-24 23:43:36.213087] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2531360 00:21:51.286 [2024-07-24 23:43:36.213092] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:51.286 [2024-07-24 23:43:36.213132] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252e600 00:21:51.286 [2024-07-24 23:43:36.213198] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2531360 00:21:51.286 [2024-07-24 23:43:36.213203] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2531360 00:21:51.286 [2024-07-24 23:43:36.213249] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:21:51.286 pt1 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.286 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.545 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.545 "name": "raid_bdev1", 00:21:51.545 "uuid": "38dd37a6-bdc7-4503-a3b6-d59d61851478", 00:21:51.545 "strip_size_kb": 0, 00:21:51.545 "state": "online", 00:21:51.545 "raid_level": 
"raid1", 00:21:51.545 "superblock": true, 00:21:51.545 "num_base_bdevs": 2, 00:21:51.545 "num_base_bdevs_discovered": 1, 00:21:51.545 "num_base_bdevs_operational": 1, 00:21:51.545 "base_bdevs_list": [ 00:21:51.545 { 00:21:51.545 "name": null, 00:21:51.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.545 "is_configured": false, 00:21:51.545 "data_offset": 256, 00:21:51.545 "data_size": 7936 00:21:51.545 }, 00:21:51.545 { 00:21:51.545 "name": "pt2", 00:21:51.545 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:51.545 "is_configured": true, 00:21:51.545 "data_offset": 256, 00:21:51.545 "data_size": 7936 00:21:51.545 } 00:21:51.545 ] 00:21:51.545 }' 00:21:51.545 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.545 23:43:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:52.113 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:52.113 23:43:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:52.113 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:52.113 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:52.113 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:52.372 [2024-07-24 23:43:37.214512] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 38dd37a6-bdc7-4503-a3b6-d59d61851478 '!=' 38dd37a6-bdc7-4503-a3b6-d59d61851478 ']' 
00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 388434 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # '[' -z 388434 ']' 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # kill -0 388434 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # uname 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 388434 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 388434' 00:21:52.372 killing process with pid 388434 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@969 -- # kill 388434 00:21:52.372 [2024-07-24 23:43:37.270996] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:52.372 [2024-07-24 23:43:37.271034] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:52.372 [2024-07-24 23:43:37.271064] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:52.372 [2024-07-24 23:43:37.271070] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2531360 name raid_bdev1, state offline 00:21:52.372 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@974 -- # wait 388434 00:21:52.372 [2024-07-24 23:43:37.289972] bdev_raid.c:1399:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:21:52.631 23:43:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:21:52.631 00:21:52.631 real 0m11.444s 00:21:52.631 user 0m21.073s 00:21:52.631 sys 0m1.777s 00:21:52.631 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:52.631 23:43:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:52.631 ************************************ 00:21:52.631 END TEST raid_superblock_test_md_separate 00:21:52.631 ************************************ 00:21:52.631 23:43:37 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:21:52.631 23:43:37 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:21:52.631 23:43:37 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:21:52.631 23:43:37 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:52.631 23:43:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:52.631 ************************************ 00:21:52.631 START TEST raid_rebuild_test_sb_md_separate 00:21:52.631 ************************************ 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:52.631 23:43:37 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=390578 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 390578 /var/tmp/spdk-raid.sock 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 390578 ']' 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:52.631 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:52.631 23:43:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:52.631 [2024-07-24 23:43:37.590434] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:21:52.631 [2024-07-24 23:43:37.590484] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid390578 ] 00:21:52.631 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:52.631 Zero copy mechanism will not be used. 00:21:52.890 [2024-07-24 23:43:37.653241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:52.890 [2024-07-24 23:43:37.729566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:52.890 [2024-07-24 23:43:37.780532] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:52.890 [2024-07-24 23:43:37.780558] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.458 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:53.458 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:21:53.458 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:53.458 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:21:53.717 BaseBdev1_malloc 00:21:53.717 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:53.717 [2024-07-24 23:43:38.700435] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:53.717 [2024-07-24 23:43:38.700473] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.717 [2024-07-24 
23:43:38.700504] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x128db50 00:21:53.717 [2024-07-24 23:43:38.700512] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.717 [2024-07-24 23:43:38.701518] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.717 [2024-07-24 23:43:38.701538] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:53.717 BaseBdev1 00:21:53.717 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:53.717 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:21:53.976 BaseBdev2_malloc 00:21:53.976 23:43:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:54.235 [2024-07-24 23:43:39.045476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:54.235 [2024-07-24 23:43:39.045508] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.235 [2024-07-24 23:43:39.045520] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e5670 00:21:54.235 [2024-07-24 23:43:39.045526] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.235 [2024-07-24 23:43:39.046451] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.235 [2024-07-24 23:43:39.046474] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:54.235 BaseBdev2 00:21:54.235 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:21:54.235 spare_malloc 00:21:54.235 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:54.493 spare_delay 00:21:54.493 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:54.752 [2024-07-24 23:43:39.546866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:54.752 [2024-07-24 23:43:39.546895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.752 [2024-07-24 23:43:39.546908] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e1c20 00:21:54.752 [2024-07-24 23:43:39.546930] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.752 [2024-07-24 23:43:39.547886] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.752 [2024-07-24 23:43:39.547905] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:54.752 spare 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:54.752 [2024-07-24 23:43:39.711322] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:54.752 [2024-07-24 23:43:39.712242] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:54.752 [2024-07-24 23:43:39.712362] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e2640 00:21:54.752 [2024-07-24 23:43:39.712372] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:54.752 [2024-07-24 23:43:39.712424] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f37e0 00:21:54.752 [2024-07-24 23:43:39.712513] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e2640 00:21:54.752 [2024-07-24 23:43:39.712519] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13e2640 00:21:54.752 [2024-07-24 23:43:39.712567] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.752 23:43:39 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.752 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:55.011 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.011 "name": "raid_bdev1", 00:21:55.011 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:21:55.011 "strip_size_kb": 0, 00:21:55.011 "state": "online", 00:21:55.011 "raid_level": "raid1", 00:21:55.011 "superblock": true, 00:21:55.011 "num_base_bdevs": 2, 00:21:55.011 "num_base_bdevs_discovered": 2, 00:21:55.011 "num_base_bdevs_operational": 2, 00:21:55.011 "base_bdevs_list": [ 00:21:55.011 { 00:21:55.011 "name": "BaseBdev1", 00:21:55.011 "uuid": "7aedfa29-9a13-54f3-afab-4f01efaa34cf", 00:21:55.011 "is_configured": true, 00:21:55.011 "data_offset": 256, 00:21:55.011 "data_size": 7936 00:21:55.011 }, 00:21:55.011 { 00:21:55.011 "name": "BaseBdev2", 00:21:55.011 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:21:55.011 "is_configured": true, 00:21:55.011 "data_offset": 256, 00:21:55.011 "data_size": 7936 00:21:55.011 } 00:21:55.011 ] 00:21:55.011 }' 00:21:55.011 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.011 23:43:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:55.578 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:55.578 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:55.578 [2024-07-24 23:43:40.533589] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.578 
23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:21:55.578 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.578 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:55.837 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:55.838 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:55.838 23:43:40 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:56.096 [2024-07-24 23:43:40.882361] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f37e0 00:21:56.096 /dev/nbd0 00:21:56.096 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:56.096 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:56.096 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:56.097 1+0 records in 00:21:56.097 1+0 records out 00:21:56.097 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240177 s, 17.1 MB/s 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:56.097 23:43:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:21:56.664 7936+0 records in 00:21:56.664 7936+0 records out 00:21:56.664 32505856 bytes (33 MB, 31 MiB) copied, 0.485478 s, 67.0 MB/s 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:21:56.664 23:43:41 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:56.664 [2024-07-24 23:43:41.616281] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:21:56.664 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:56.923 [2024-07-24 23:43:41.768676] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.923 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.182 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.182 "name": "raid_bdev1", 00:21:57.182 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:21:57.182 "strip_size_kb": 0, 00:21:57.182 "state": "online", 00:21:57.182 "raid_level": "raid1", 00:21:57.182 "superblock": true, 00:21:57.182 "num_base_bdevs": 2, 00:21:57.182 "num_base_bdevs_discovered": 1, 00:21:57.182 "num_base_bdevs_operational": 1, 00:21:57.182 "base_bdevs_list": [ 00:21:57.182 { 00:21:57.182 "name": null, 00:21:57.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.182 "is_configured": false, 00:21:57.182 "data_offset": 256, 00:21:57.182 "data_size": 7936 00:21:57.182 }, 00:21:57.182 { 00:21:57.182 "name": "BaseBdev2", 
00:21:57.182 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:21:57.182 "is_configured": true, 00:21:57.182 "data_offset": 256, 00:21:57.182 "data_size": 7936 00:21:57.182 } 00:21:57.182 ] 00:21:57.182 }' 00:21:57.182 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.182 23:43:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:57.441 23:43:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:57.699 [2024-07-24 23:43:42.562765] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:57.699 [2024-07-24 23:43:42.564790] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e4000 00:21:57.699 [2024-07-24 23:43:42.566314] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:57.699 23:43:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:58.633 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.633 23:43:43 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:58.891 "name": "raid_bdev1", 00:21:58.891 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:21:58.891 "strip_size_kb": 0, 00:21:58.891 "state": "online", 00:21:58.891 "raid_level": "raid1", 00:21:58.891 "superblock": true, 00:21:58.891 "num_base_bdevs": 2, 00:21:58.891 "num_base_bdevs_discovered": 2, 00:21:58.891 "num_base_bdevs_operational": 2, 00:21:58.891 "process": { 00:21:58.891 "type": "rebuild", 00:21:58.891 "target": "spare", 00:21:58.891 "progress": { 00:21:58.891 "blocks": 2816, 00:21:58.891 "percent": 35 00:21:58.891 } 00:21:58.891 }, 00:21:58.891 "base_bdevs_list": [ 00:21:58.891 { 00:21:58.891 "name": "spare", 00:21:58.891 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:21:58.891 "is_configured": true, 00:21:58.891 "data_offset": 256, 00:21:58.891 "data_size": 7936 00:21:58.891 }, 00:21:58.891 { 00:21:58.891 "name": "BaseBdev2", 00:21:58.891 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:21:58.891 "is_configured": true, 00:21:58.891 "data_offset": 256, 00:21:58.891 "data_size": 7936 00:21:58.891 } 00:21:58.891 ] 00:21:58.891 }' 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:58.891 23:43:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:21:59.149 [2024-07-24 23:43:43.983056] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:59.149 [2024-07-24 23:43:44.077185] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:59.149 [2024-07-24 23:43:44.077220] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:59.149 [2024-07-24 23:43:44.077244] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:59.149 [2024-07-24 23:43:44.077249] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.149 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.150 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.150 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:21:59.150 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.408 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.408 "name": "raid_bdev1", 00:21:59.408 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:21:59.408 "strip_size_kb": 0, 00:21:59.408 "state": "online", 00:21:59.408 "raid_level": "raid1", 00:21:59.408 "superblock": true, 00:21:59.408 "num_base_bdevs": 2, 00:21:59.408 "num_base_bdevs_discovered": 1, 00:21:59.408 "num_base_bdevs_operational": 1, 00:21:59.408 "base_bdevs_list": [ 00:21:59.408 { 00:21:59.408 "name": null, 00:21:59.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.408 "is_configured": false, 00:21:59.408 "data_offset": 256, 00:21:59.408 "data_size": 7936 00:21:59.408 }, 00:21:59.408 { 00:21:59.408 "name": "BaseBdev2", 00:21:59.408 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:21:59.408 "is_configured": true, 00:21:59.408 "data_offset": 256, 00:21:59.408 "data_size": 7936 00:21:59.408 } 00:21:59.408 ] 00:21:59.408 }' 00:21:59.408 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.408 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:59.975 23:43:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:59.975 "name": "raid_bdev1", 00:21:59.975 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:21:59.975 "strip_size_kb": 0, 00:21:59.975 "state": "online", 00:21:59.975 "raid_level": "raid1", 00:21:59.975 "superblock": true, 00:21:59.975 "num_base_bdevs": 2, 00:21:59.975 "num_base_bdevs_discovered": 1, 00:21:59.975 "num_base_bdevs_operational": 1, 00:21:59.975 "base_bdevs_list": [ 00:21:59.975 { 00:21:59.975 "name": null, 00:21:59.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.975 "is_configured": false, 00:21:59.975 "data_offset": 256, 00:21:59.975 "data_size": 7936 00:21:59.975 }, 00:21:59.975 { 00:21:59.975 "name": "BaseBdev2", 00:21:59.975 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:21:59.975 "is_configured": true, 00:21:59.975 "data_offset": 256, 00:21:59.975 "data_size": 7936 00:21:59.975 } 00:21:59.975 ] 00:21:59.975 }' 00:21:59.975 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:00.233 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:00.233 23:43:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:00.233 23:43:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:00.233 23:43:45 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:00.233 [2024-07-24 23:43:45.187052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:00.233 [2024-07-24 23:43:45.189001] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x128d700 00:22:00.233 [2024-07-24 23:43:45.190073] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:00.233 23:43:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:01.609 "name": "raid_bdev1", 00:22:01.609 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:01.609 "strip_size_kb": 0, 00:22:01.609 "state": "online", 00:22:01.609 "raid_level": "raid1", 00:22:01.609 "superblock": true, 00:22:01.609 "num_base_bdevs": 2, 
00:22:01.609 "num_base_bdevs_discovered": 2, 00:22:01.609 "num_base_bdevs_operational": 2, 00:22:01.609 "process": { 00:22:01.609 "type": "rebuild", 00:22:01.609 "target": "spare", 00:22:01.609 "progress": { 00:22:01.609 "blocks": 2816, 00:22:01.609 "percent": 35 00:22:01.609 } 00:22:01.609 }, 00:22:01.609 "base_bdevs_list": [ 00:22:01.609 { 00:22:01.609 "name": "spare", 00:22:01.609 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:01.609 "is_configured": true, 00:22:01.609 "data_offset": 256, 00:22:01.609 "data_size": 7936 00:22:01.609 }, 00:22:01.609 { 00:22:01.609 "name": "BaseBdev2", 00:22:01.609 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:01.609 "is_configured": true, 00:22:01.609 "data_offset": 256, 00:22:01.609 "data_size": 7936 00:22:01.609 } 00:22:01.609 ] 00:22:01.609 }' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:01.609 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=819 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.609 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:01.868 "name": "raid_bdev1", 00:22:01.868 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:01.868 "strip_size_kb": 0, 00:22:01.868 "state": "online", 00:22:01.868 "raid_level": "raid1", 00:22:01.868 "superblock": true, 00:22:01.868 "num_base_bdevs": 2, 00:22:01.868 "num_base_bdevs_discovered": 2, 00:22:01.868 "num_base_bdevs_operational": 2, 00:22:01.868 "process": { 00:22:01.868 "type": "rebuild", 00:22:01.868 "target": "spare", 00:22:01.868 "progress": { 00:22:01.868 "blocks": 3328, 00:22:01.868 "percent": 41 00:22:01.868 } 00:22:01.868 }, 00:22:01.868 "base_bdevs_list": [ 00:22:01.868 { 00:22:01.868 "name": "spare", 00:22:01.868 "uuid": 
"5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:01.868 "is_configured": true, 00:22:01.868 "data_offset": 256, 00:22:01.868 "data_size": 7936 00:22:01.868 }, 00:22:01.868 { 00:22:01.868 "name": "BaseBdev2", 00:22:01.868 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:01.868 "is_configured": true, 00:22:01.868 "data_offset": 256, 00:22:01.868 "data_size": 7936 00:22:01.868 } 00:22:01.868 ] 00:22:01.868 }' 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:01.868 23:43:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:02.803 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.803 23:43:47 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:03.061 "name": "raid_bdev1", 00:22:03.061 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:03.061 "strip_size_kb": 0, 00:22:03.061 "state": "online", 00:22:03.061 "raid_level": "raid1", 00:22:03.061 "superblock": true, 00:22:03.061 "num_base_bdevs": 2, 00:22:03.061 "num_base_bdevs_discovered": 2, 00:22:03.061 "num_base_bdevs_operational": 2, 00:22:03.061 "process": { 00:22:03.061 "type": "rebuild", 00:22:03.061 "target": "spare", 00:22:03.061 "progress": { 00:22:03.061 "blocks": 6656, 00:22:03.061 "percent": 83 00:22:03.061 } 00:22:03.061 }, 00:22:03.061 "base_bdevs_list": [ 00:22:03.061 { 00:22:03.061 "name": "spare", 00:22:03.061 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:03.061 "is_configured": true, 00:22:03.061 "data_offset": 256, 00:22:03.061 "data_size": 7936 00:22:03.061 }, 00:22:03.061 { 00:22:03.061 "name": "BaseBdev2", 00:22:03.061 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:03.061 "is_configured": true, 00:22:03.061 "data_offset": 256, 00:22:03.061 "data_size": 7936 00:22:03.061 } 00:22:03.061 ] 00:22:03.061 }' 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:03.061 23:43:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:03.320 [2024-07-24 23:43:48.311780] bdev_raid.c:2870:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:22:03.320 [2024-07-24 23:43:48.311820] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:03.320 [2024-07-24 23:43:48.311876] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.258 23:43:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:04.258 "name": "raid_bdev1", 00:22:04.258 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:04.258 "strip_size_kb": 0, 00:22:04.258 "state": "online", 00:22:04.258 "raid_level": "raid1", 00:22:04.258 "superblock": true, 00:22:04.258 "num_base_bdevs": 2, 00:22:04.258 "num_base_bdevs_discovered": 2, 00:22:04.258 "num_base_bdevs_operational": 2, 00:22:04.258 "base_bdevs_list": [ 00:22:04.258 { 00:22:04.258 "name": "spare", 00:22:04.258 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 
00:22:04.258 "is_configured": true, 00:22:04.258 "data_offset": 256, 00:22:04.258 "data_size": 7936 00:22:04.258 }, 00:22:04.258 { 00:22:04.258 "name": "BaseBdev2", 00:22:04.258 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:04.258 "is_configured": true, 00:22:04.258 "data_offset": 256, 00:22:04.258 "data_size": 7936 00:22:04.258 } 00:22:04.258 ] 00:22:04.258 }' 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.258 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.517 23:43:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:04.517 "name": "raid_bdev1", 00:22:04.517 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:04.517 "strip_size_kb": 0, 00:22:04.517 "state": "online", 00:22:04.517 "raid_level": "raid1", 00:22:04.517 "superblock": true, 00:22:04.517 "num_base_bdevs": 2, 00:22:04.517 "num_base_bdevs_discovered": 2, 00:22:04.517 "num_base_bdevs_operational": 2, 00:22:04.517 "base_bdevs_list": [ 00:22:04.517 { 00:22:04.517 "name": "spare", 00:22:04.517 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:04.517 "is_configured": true, 00:22:04.517 "data_offset": 256, 00:22:04.517 "data_size": 7936 00:22:04.517 }, 00:22:04.517 { 00:22:04.517 "name": "BaseBdev2", 00:22:04.517 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:04.517 "is_configured": true, 00:22:04.517 "data_offset": 256, 00:22:04.517 "data_size": 7936 00:22:04.517 } 00:22:04.517 ] 00:22:04.517 }' 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.517 23:43:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.517 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.775 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.775 "name": "raid_bdev1", 00:22:04.775 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:04.775 "strip_size_kb": 0, 00:22:04.775 "state": "online", 00:22:04.775 "raid_level": "raid1", 00:22:04.775 "superblock": true, 00:22:04.775 "num_base_bdevs": 2, 00:22:04.775 "num_base_bdevs_discovered": 2, 00:22:04.775 "num_base_bdevs_operational": 2, 00:22:04.775 "base_bdevs_list": [ 00:22:04.775 { 00:22:04.775 "name": "spare", 00:22:04.775 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:04.775 "is_configured": true, 00:22:04.775 "data_offset": 256, 00:22:04.775 "data_size": 7936 00:22:04.775 }, 00:22:04.775 { 00:22:04.775 "name": "BaseBdev2", 00:22:04.775 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:04.775 "is_configured": true, 00:22:04.775 "data_offset": 256, 00:22:04.775 "data_size": 7936 00:22:04.775 } 00:22:04.775 ] 
00:22:04.775 }' 00:22:04.775 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.775 23:43:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:05.341 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:05.341 [2024-07-24 23:43:50.259636] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:05.341 [2024-07-24 23:43:50.259656] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:05.341 [2024-07-24 23:43:50.259696] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:05.341 [2024-07-24 23:43:50.259734] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:05.341 [2024-07-24 23:43:50.259740] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e2640 name raid_bdev1, state offline 00:22:05.341 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.341 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:05.600 23:43:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:05.600 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:05.859 /dev/nbd0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:22:05.859 23:43:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:05.859 1+0 records in 00:22:05.859 1+0 records out 00:22:05.859 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217929 s, 18.8 MB/s 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:05.859 /dev/nbd1 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:05.859 23:43:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:05.859 1+0 records in 00:22:05.859 1+0 records out 00:22:05.859 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00017909 s, 22.9 MB/s 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:05.859 23:43:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:05.859 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:06.119 23:43:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:06.119 23:43:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:06.119 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:06.378 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:06.637 [2024-07-24 23:43:51.594366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:06.637 [2024-07-24 23:43:51.594397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:06.637 [2024-07-24 23:43:51.594410] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x128c270 00:22:06.637 [2024-07-24 23:43:51.594432] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:06.637 [2024-07-24 23:43:51.595507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:06.637 [2024-07-24 23:43:51.595525] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:06.637 [2024-07-24 23:43:51.595562] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:06.637 [2024-07-24 23:43:51.595578] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:06.637 [2024-07-24 23:43:51.595641] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:06.637 spare 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.637 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.896 [2024-07-24 23:43:51.695926] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1426d50 00:22:06.896 [2024-07-24 23:43:51.695935] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:06.896 [2024-07-24 23:43:51.695974] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1426a10 00:22:06.896 [2024-07-24 23:43:51.696049] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1426d50 00:22:06.896 [2024-07-24 23:43:51.696054] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1426d50 00:22:06.896 [2024-07-24 23:43:51.696098] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:06.896 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.896 "name": "raid_bdev1", 00:22:06.896 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:06.896 "strip_size_kb": 0, 00:22:06.896 "state": "online", 00:22:06.896 "raid_level": "raid1", 00:22:06.896 "superblock": true, 00:22:06.896 "num_base_bdevs": 2, 00:22:06.896 
"num_base_bdevs_discovered": 2, 00:22:06.896 "num_base_bdevs_operational": 2, 00:22:06.896 "base_bdevs_list": [ 00:22:06.896 { 00:22:06.896 "name": "spare", 00:22:06.896 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:06.896 "is_configured": true, 00:22:06.896 "data_offset": 256, 00:22:06.896 "data_size": 7936 00:22:06.896 }, 00:22:06.896 { 00:22:06.896 "name": "BaseBdev2", 00:22:06.896 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:06.896 "is_configured": true, 00:22:06.896 "data_offset": 256, 00:22:06.896 "data_size": 7936 00:22:06.896 } 00:22:06.896 ] 00:22:06.896 }' 00:22:06.896 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.897 23:43:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:07.464 "name": "raid_bdev1", 00:22:07.464 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:07.464 
"strip_size_kb": 0, 00:22:07.464 "state": "online", 00:22:07.464 "raid_level": "raid1", 00:22:07.464 "superblock": true, 00:22:07.464 "num_base_bdevs": 2, 00:22:07.464 "num_base_bdevs_discovered": 2, 00:22:07.464 "num_base_bdevs_operational": 2, 00:22:07.464 "base_bdevs_list": [ 00:22:07.464 { 00:22:07.464 "name": "spare", 00:22:07.464 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:07.464 "is_configured": true, 00:22:07.464 "data_offset": 256, 00:22:07.464 "data_size": 7936 00:22:07.464 }, 00:22:07.464 { 00:22:07.464 "name": "BaseBdev2", 00:22:07.464 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:07.464 "is_configured": true, 00:22:07.464 "data_offset": 256, 00:22:07.464 "data_size": 7936 00:22:07.464 } 00:22:07.464 ] 00:22:07.464 }' 00:22:07.464 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:07.723 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:07.982 [2024-07-24 23:43:52.849677] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.982 23:43:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.241 23:43:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.242 "name": "raid_bdev1", 00:22:08.242 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:08.242 "strip_size_kb": 0, 00:22:08.242 "state": "online", 00:22:08.242 "raid_level": "raid1", 00:22:08.242 "superblock": true, 00:22:08.242 
"num_base_bdevs": 2, 00:22:08.242 "num_base_bdevs_discovered": 1, 00:22:08.242 "num_base_bdevs_operational": 1, 00:22:08.242 "base_bdevs_list": [ 00:22:08.242 { 00:22:08.242 "name": null, 00:22:08.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.242 "is_configured": false, 00:22:08.242 "data_offset": 256, 00:22:08.242 "data_size": 7936 00:22:08.242 }, 00:22:08.242 { 00:22:08.242 "name": "BaseBdev2", 00:22:08.242 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:08.242 "is_configured": true, 00:22:08.242 "data_offset": 256, 00:22:08.242 "data_size": 7936 00:22:08.242 } 00:22:08.242 ] 00:22:08.242 }' 00:22:08.242 23:43:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.242 23:43:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:08.809 23:43:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:08.809 [2024-07-24 23:43:53.683856] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:08.809 [2024-07-24 23:43:53.683977] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:08.809 [2024-07-24 23:43:53.683989] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:22:08.809 [2024-07-24 23:43:53.684009] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:08.809 [2024-07-24 23:43:53.685952] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f37e0 00:22:08.809 [2024-07-24 23:43:53.687089] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:08.809 23:43:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.745 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:10.004 "name": "raid_bdev1", 00:22:10.004 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:10.004 "strip_size_kb": 0, 00:22:10.004 "state": "online", 00:22:10.004 "raid_level": "raid1", 00:22:10.004 "superblock": true, 00:22:10.004 "num_base_bdevs": 2, 00:22:10.004 "num_base_bdevs_discovered": 2, 00:22:10.004 "num_base_bdevs_operational": 2, 00:22:10.004 "process": { 00:22:10.004 "type": "rebuild", 00:22:10.004 
"target": "spare", 00:22:10.004 "progress": { 00:22:10.004 "blocks": 2816, 00:22:10.004 "percent": 35 00:22:10.004 } 00:22:10.004 }, 00:22:10.004 "base_bdevs_list": [ 00:22:10.004 { 00:22:10.004 "name": "spare", 00:22:10.004 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:10.004 "is_configured": true, 00:22:10.004 "data_offset": 256, 00:22:10.004 "data_size": 7936 00:22:10.004 }, 00:22:10.004 { 00:22:10.004 "name": "BaseBdev2", 00:22:10.004 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:10.004 "is_configured": true, 00:22:10.004 "data_offset": 256, 00:22:10.004 "data_size": 7936 00:22:10.004 } 00:22:10.004 ] 00:22:10.004 }' 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:10.004 23:43:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:10.263 [2024-07-24 23:43:55.116260] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:10.263 [2024-07-24 23:43:55.197687] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:10.263 [2024-07-24 23:43:55.197720] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.263 [2024-07-24 23:43:55.197728] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:10.263 [2024-07-24 23:43:55.197748] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.263 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.522 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.522 "name": "raid_bdev1", 00:22:10.522 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:10.522 "strip_size_kb": 0, 00:22:10.522 "state": "online", 00:22:10.522 "raid_level": "raid1", 00:22:10.522 "superblock": true, 00:22:10.522 "num_base_bdevs": 2, 00:22:10.522 "num_base_bdevs_discovered": 1, 
00:22:10.522 "num_base_bdevs_operational": 1, 00:22:10.522 "base_bdevs_list": [ 00:22:10.522 { 00:22:10.522 "name": null, 00:22:10.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.522 "is_configured": false, 00:22:10.522 "data_offset": 256, 00:22:10.522 "data_size": 7936 00:22:10.522 }, 00:22:10.522 { 00:22:10.522 "name": "BaseBdev2", 00:22:10.522 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:10.522 "is_configured": true, 00:22:10.522 "data_offset": 256, 00:22:10.522 "data_size": 7936 00:22:10.522 } 00:22:10.522 ] 00:22:10.522 }' 00:22:10.522 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.522 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:11.088 23:43:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:11.088 [2024-07-24 23:43:56.014599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:11.088 [2024-07-24 23:43:56.014633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.088 [2024-07-24 23:43:56.014663] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f3120 00:22:11.088 [2024-07-24 23:43:56.014674] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.088 [2024-07-24 23:43:56.014827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.088 [2024-07-24 23:43:56.014835] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:11.088 [2024-07-24 23:43:56.014875] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:11.088 [2024-07-24 23:43:56.014882] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:22:11.088 [2024-07-24 23:43:56.014888] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:11.088 [2024-07-24 23:43:56.014898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:11.088 [2024-07-24 23:43:56.016745] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f2cf0 00:22:11.088 [2024-07-24 23:43:56.017675] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:11.088 spare 00:22:11.088 23:43:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.461 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:12.461 "name": "raid_bdev1", 00:22:12.461 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:12.461 "strip_size_kb": 0, 00:22:12.461 "state": "online", 00:22:12.461 "raid_level": "raid1", 00:22:12.461 "superblock": 
true, 00:22:12.461 "num_base_bdevs": 2, 00:22:12.461 "num_base_bdevs_discovered": 2, 00:22:12.461 "num_base_bdevs_operational": 2, 00:22:12.461 "process": { 00:22:12.461 "type": "rebuild", 00:22:12.461 "target": "spare", 00:22:12.461 "progress": { 00:22:12.461 "blocks": 2816, 00:22:12.461 "percent": 35 00:22:12.461 } 00:22:12.461 }, 00:22:12.461 "base_bdevs_list": [ 00:22:12.461 { 00:22:12.461 "name": "spare", 00:22:12.461 "uuid": "5700aef1-dd29-5838-a96b-2020eefdbd12", 00:22:12.461 "is_configured": true, 00:22:12.461 "data_offset": 256, 00:22:12.461 "data_size": 7936 00:22:12.461 }, 00:22:12.461 { 00:22:12.462 "name": "BaseBdev2", 00:22:12.462 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:12.462 "is_configured": true, 00:22:12.462 "data_offset": 256, 00:22:12.462 "data_size": 7936 00:22:12.462 } 00:22:12.462 ] 00:22:12.462 }' 00:22:12.462 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:12.462 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:12.462 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:12.462 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:12.462 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:12.462 [2024-07-24 23:43:57.454795] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:12.720 [2024-07-24 23:43:57.528232] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:12.720 [2024-07-24 23:43:57.528262] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:12.720 [2024-07-24 23:43:57.528270] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:12.720 [2024-07-24 23:43:57.528274] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.720 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.978 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.978 "name": "raid_bdev1", 00:22:12.978 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 
00:22:12.978 "strip_size_kb": 0, 00:22:12.978 "state": "online", 00:22:12.978 "raid_level": "raid1", 00:22:12.978 "superblock": true, 00:22:12.978 "num_base_bdevs": 2, 00:22:12.978 "num_base_bdevs_discovered": 1, 00:22:12.978 "num_base_bdevs_operational": 1, 00:22:12.978 "base_bdevs_list": [ 00:22:12.978 { 00:22:12.978 "name": null, 00:22:12.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.978 "is_configured": false, 00:22:12.978 "data_offset": 256, 00:22:12.978 "data_size": 7936 00:22:12.978 }, 00:22:12.978 { 00:22:12.978 "name": "BaseBdev2", 00:22:12.978 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:12.978 "is_configured": true, 00:22:12.978 "data_offset": 256, 00:22:12.978 "data_size": 7936 00:22:12.978 } 00:22:12.978 ] 00:22:12.978 }' 00:22:12.978 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.978 23:43:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:13.236 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:13.236 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:13.236 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:13.236 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:13.236 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:13.495 23:43:58 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:13.495 "name": "raid_bdev1", 00:22:13.495 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:13.495 "strip_size_kb": 0, 00:22:13.495 "state": "online", 00:22:13.495 "raid_level": "raid1", 00:22:13.495 "superblock": true, 00:22:13.495 "num_base_bdevs": 2, 00:22:13.495 "num_base_bdevs_discovered": 1, 00:22:13.495 "num_base_bdevs_operational": 1, 00:22:13.495 "base_bdevs_list": [ 00:22:13.495 { 00:22:13.495 "name": null, 00:22:13.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.495 "is_configured": false, 00:22:13.495 "data_offset": 256, 00:22:13.495 "data_size": 7936 00:22:13.495 }, 00:22:13.495 { 00:22:13.495 "name": "BaseBdev2", 00:22:13.495 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:13.495 "is_configured": true, 00:22:13.495 "data_offset": 256, 00:22:13.495 "data_size": 7936 00:22:13.495 } 00:22:13.495 ] 00:22:13.495 }' 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:13.495 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:13.753 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:14.011 [2024-07-24 23:43:58.810352] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:22:14.011 [2024-07-24 23:43:58.810387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:14.011 [2024-07-24 23:43:58.810402] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x128dd80 00:22:14.011 [2024-07-24 23:43:58.810409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:14.011 [2024-07-24 23:43:58.810569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:14.011 [2024-07-24 23:43:58.810579] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:14.011 [2024-07-24 23:43:58.810609] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:14.011 [2024-07-24 23:43:58.810616] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:14.011 [2024-07-24 23:43:58.810621] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:14.011 BaseBdev1 00:22:14.011 23:43:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.945 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:14.945 
23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.946 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.946 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.946 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.946 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.946 23:43:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.204 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.204 "name": "raid_bdev1", 00:22:15.204 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:15.204 "strip_size_kb": 0, 00:22:15.204 "state": "online", 00:22:15.204 "raid_level": "raid1", 00:22:15.204 "superblock": true, 00:22:15.204 "num_base_bdevs": 2, 00:22:15.204 "num_base_bdevs_discovered": 1, 00:22:15.204 "num_base_bdevs_operational": 1, 00:22:15.204 "base_bdevs_list": [ 00:22:15.204 { 00:22:15.204 "name": null, 00:22:15.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.204 "is_configured": false, 00:22:15.204 "data_offset": 256, 00:22:15.204 "data_size": 7936 00:22:15.204 }, 00:22:15.204 { 00:22:15.204 "name": "BaseBdev2", 00:22:15.204 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:15.204 "is_configured": true, 00:22:15.204 "data_offset": 256, 00:22:15.204 "data_size": 7936 00:22:15.204 } 00:22:15.204 ] 00:22:15.204 }' 00:22:15.204 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.204 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:15.771 "name": "raid_bdev1", 00:22:15.771 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:15.771 "strip_size_kb": 0, 00:22:15.771 "state": "online", 00:22:15.771 "raid_level": "raid1", 00:22:15.771 "superblock": true, 00:22:15.771 "num_base_bdevs": 2, 00:22:15.771 "num_base_bdevs_discovered": 1, 00:22:15.771 "num_base_bdevs_operational": 1, 00:22:15.771 "base_bdevs_list": [ 00:22:15.771 { 00:22:15.771 "name": null, 00:22:15.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.771 "is_configured": false, 00:22:15.771 "data_offset": 256, 00:22:15.771 "data_size": 7936 00:22:15.771 }, 00:22:15.771 { 00:22:15.771 "name": "BaseBdev2", 00:22:15.771 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:15.771 "is_configured": true, 00:22:15.771 "data_offset": 256, 00:22:15.771 "data_size": 7936 00:22:15.771 } 00:22:15.771 ] 00:22:15.771 }' 00:22:15.771 23:44:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 
00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:15.771 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:16.030 [2024-07-24 23:44:00.919829] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:16.030 [2024-07-24 23:44:00.919925] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:16.030 [2024-07-24 23:44:00.919933] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:16.030 request: 00:22:16.030 { 00:22:16.030 "base_bdev": "BaseBdev1", 00:22:16.030 "raid_bdev": "raid_bdev1", 00:22:16.030 "method": "bdev_raid_add_base_bdev", 00:22:16.030 "req_id": 1 00:22:16.030 } 00:22:16.030 Got JSON-RPC error response 00:22:16.030 response: 00:22:16.030 { 00:22:16.030 "code": -22, 00:22:16.030 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:16.030 } 00:22:16.030 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # es=1 00:22:16.030 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:22:16.030 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:22:16.030 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:22:16.030 23:44:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.965 23:44:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.223 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.223 "name": "raid_bdev1", 00:22:17.223 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:17.223 "strip_size_kb": 0, 00:22:17.223 "state": "online", 00:22:17.223 "raid_level": "raid1", 00:22:17.223 "superblock": true, 00:22:17.223 "num_base_bdevs": 2, 00:22:17.223 "num_base_bdevs_discovered": 1, 
00:22:17.223 "num_base_bdevs_operational": 1, 00:22:17.223 "base_bdevs_list": [ 00:22:17.223 { 00:22:17.223 "name": null, 00:22:17.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.223 "is_configured": false, 00:22:17.223 "data_offset": 256, 00:22:17.223 "data_size": 7936 00:22:17.223 }, 00:22:17.223 { 00:22:17.223 "name": "BaseBdev2", 00:22:17.223 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:17.223 "is_configured": true, 00:22:17.223 "data_offset": 256, 00:22:17.223 "data_size": 7936 00:22:17.223 } 00:22:17.223 ] 00:22:17.223 }' 00:22:17.223 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.223 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:17.789 "name": "raid_bdev1", 00:22:17.789 "uuid": "4ed6a8ec-03c9-4c29-a224-8caa13066ec1", 00:22:17.789 "strip_size_kb": 0, 00:22:17.789 
"state": "online", 00:22:17.789 "raid_level": "raid1", 00:22:17.789 "superblock": true, 00:22:17.789 "num_base_bdevs": 2, 00:22:17.789 "num_base_bdevs_discovered": 1, 00:22:17.789 "num_base_bdevs_operational": 1, 00:22:17.789 "base_bdevs_list": [ 00:22:17.789 { 00:22:17.789 "name": null, 00:22:17.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.789 "is_configured": false, 00:22:17.789 "data_offset": 256, 00:22:17.789 "data_size": 7936 00:22:17.789 }, 00:22:17.789 { 00:22:17.789 "name": "BaseBdev2", 00:22:17.789 "uuid": "92639f06-72fa-50f6-b13f-5d8dd8fac2c5", 00:22:17.789 "is_configured": true, 00:22:17.789 "data_offset": 256, 00:22:17.789 "data_size": 7936 00:22:17.789 } 00:22:17.789 ] 00:22:17.789 }' 00:22:17.789 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 390578 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 390578 ']' 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 390578 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 390578 00:22:18.048 23:44:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 390578' 00:22:18.048 killing process with pid 390578 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 390578 00:22:18.048 Received shutdown signal, test time was about 10.026194 seconds 00:22:18.048 00:22:18.048 Latency(us) 00:22:18.048 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:18.048 =================================================================================================================== 00:22:18.048 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:18.048 [2024-07-24 23:44:02.895886] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:18.048 [2024-07-24 23:44:02.895952] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:18.048 [2024-07-24 23:44:02.895982] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:18.048 [2024-07-24 23:44:02.895987] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1426d50 name raid_bdev1, state offline 00:22:18.048 23:44:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 390578 00:22:18.048 [2024-07-24 23:44:02.922642] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:18.307 23:44:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:22:18.307 00:22:18.307 real 0m25.560s 00:22:18.307 user 0m39.235s 00:22:18.307 sys 0m3.243s 00:22:18.307 23:44:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:22:18.307 23:44:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:18.307 ************************************ 00:22:18.307 END TEST raid_rebuild_test_sb_md_separate 00:22:18.307 ************************************ 00:22:18.307 23:44:03 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:22:18.307 23:44:03 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:22:18.307 23:44:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:22:18.307 23:44:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:18.307 23:44:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:18.307 ************************************ 00:22:18.307 START TEST raid_state_function_test_sb_md_interleaved 00:22:18.307 ************************************ 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:18.307 23:44:03 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=395140 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 395140' 00:22:18.307 Process raid pid: 395140 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 395140 /var/tmp/spdk-raid.sock 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 395140 ']' 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:18.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:18.307 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:18.307 [2024-07-24 23:44:03.205697] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:22:18.307 [2024-07-24 23:44:03.205737] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:18.307 [2024-07-24 23:44:03.268568] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:18.570 [2024-07-24 23:44:03.347140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:18.570 [2024-07-24 23:44:03.398059] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:18.570 [2024-07-24 23:44:03.398082] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:19.140 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:19.140 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:22:19.140 23:44:03 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:19.398 [2024-07-24 23:44:04.152844] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:19.398 [2024-07-24 23:44:04.152874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:19.398 [2024-07-24 23:44:04.152879] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:19.398 [2024-07-24 23:44:04.152884] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.398 "name": "Existed_Raid", 00:22:19.398 "uuid": "cfee6506-45aa-47f1-8d39-dad802a381cc", 00:22:19.398 "strip_size_kb": 0, 00:22:19.398 "state": "configuring", 00:22:19.398 "raid_level": "raid1", 00:22:19.398 "superblock": true, 00:22:19.398 "num_base_bdevs": 2, 00:22:19.398 "num_base_bdevs_discovered": 0, 00:22:19.398 "num_base_bdevs_operational": 2, 00:22:19.398 "base_bdevs_list": [ 00:22:19.398 { 
00:22:19.398 "name": "BaseBdev1", 00:22:19.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.398 "is_configured": false, 00:22:19.398 "data_offset": 0, 00:22:19.398 "data_size": 0 00:22:19.398 }, 00:22:19.398 { 00:22:19.398 "name": "BaseBdev2", 00:22:19.398 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.398 "is_configured": false, 00:22:19.398 "data_offset": 0, 00:22:19.398 "data_size": 0 00:22:19.398 } 00:22:19.398 ] 00:22:19.398 }' 00:22:19.398 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.399 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:19.965 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:20.223 [2024-07-24 23:44:04.966863] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:20.223 [2024-07-24 23:44:04.966885] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa08b10 name Existed_Raid, state configuring 00:22:20.223 23:44:04 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:20.223 [2024-07-24 23:44:05.123284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:20.223 [2024-07-24 23:44:05.123305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:20.223 [2024-07-24 23:44:05.123310] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:20.224 [2024-07-24 23:44:05.123315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:20.224 
23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:22:20.482 [2024-07-24 23:44:05.288142] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:20.482 BaseBdev1 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:20.482 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:20.741 [ 00:22:20.741 { 00:22:20.741 "name": "BaseBdev1", 00:22:20.741 "aliases": [ 00:22:20.741 "c9d751e8-7303-4799-b1ee-addc03c7aba4" 00:22:20.741 ], 00:22:20.741 "product_name": "Malloc disk", 00:22:20.741 "block_size": 4128, 00:22:20.741 "num_blocks": 8192, 00:22:20.741 "uuid": "c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:20.741 "md_size": 32, 00:22:20.741 
"md_interleave": true, 00:22:20.741 "dif_type": 0, 00:22:20.741 "assigned_rate_limits": { 00:22:20.741 "rw_ios_per_sec": 0, 00:22:20.741 "rw_mbytes_per_sec": 0, 00:22:20.741 "r_mbytes_per_sec": 0, 00:22:20.741 "w_mbytes_per_sec": 0 00:22:20.741 }, 00:22:20.741 "claimed": true, 00:22:20.741 "claim_type": "exclusive_write", 00:22:20.741 "zoned": false, 00:22:20.741 "supported_io_types": { 00:22:20.741 "read": true, 00:22:20.741 "write": true, 00:22:20.741 "unmap": true, 00:22:20.741 "flush": true, 00:22:20.741 "reset": true, 00:22:20.741 "nvme_admin": false, 00:22:20.741 "nvme_io": false, 00:22:20.741 "nvme_io_md": false, 00:22:20.741 "write_zeroes": true, 00:22:20.741 "zcopy": true, 00:22:20.741 "get_zone_info": false, 00:22:20.741 "zone_management": false, 00:22:20.741 "zone_append": false, 00:22:20.741 "compare": false, 00:22:20.741 "compare_and_write": false, 00:22:20.741 "abort": true, 00:22:20.741 "seek_hole": false, 00:22:20.741 "seek_data": false, 00:22:20.741 "copy": true, 00:22:20.741 "nvme_iov_md": false 00:22:20.741 }, 00:22:20.741 "memory_domains": [ 00:22:20.741 { 00:22:20.741 "dma_device_id": "system", 00:22:20.741 "dma_device_type": 1 00:22:20.741 }, 00:22:20.741 { 00:22:20.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:20.741 "dma_device_type": 2 00:22:20.741 } 00:22:20.741 ], 00:22:20.741 "driver_specific": {} 00:22:20.741 } 00:22:20.741 ] 00:22:20.741 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:22:20.741 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:20.742 23:44:05 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.742 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.046 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.046 "name": "Existed_Raid", 00:22:21.046 "uuid": "fde444bf-678b-43fe-9d13-cb5ed94891c0", 00:22:21.046 "strip_size_kb": 0, 00:22:21.046 "state": "configuring", 00:22:21.046 "raid_level": "raid1", 00:22:21.046 "superblock": true, 00:22:21.046 "num_base_bdevs": 2, 00:22:21.046 "num_base_bdevs_discovered": 1, 00:22:21.046 "num_base_bdevs_operational": 2, 00:22:21.046 "base_bdevs_list": [ 00:22:21.046 { 00:22:21.046 "name": "BaseBdev1", 00:22:21.046 "uuid": "c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:21.046 "is_configured": true, 00:22:21.046 "data_offset": 256, 00:22:21.046 "data_size": 7936 00:22:21.046 }, 
00:22:21.046 { 00:22:21.046 "name": "BaseBdev2", 00:22:21.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.046 "is_configured": false, 00:22:21.046 "data_offset": 0, 00:22:21.046 "data_size": 0 00:22:21.046 } 00:22:21.046 ] 00:22:21.046 }' 00:22:21.046 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.046 23:44:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:21.306 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:21.564 [2024-07-24 23:44:06.451179] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:21.564 [2024-07-24 23:44:06.451210] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa083a0 name Existed_Raid, state configuring 00:22:21.564 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:21.823 [2024-07-24 23:44:06.607607] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:21.823 [2024-07-24 23:44:06.608634] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:21.823 [2024-07-24 23:44:06.608657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.823 "name": "Existed_Raid", 00:22:21.823 "uuid": "5ff18d39-2d05-4adc-8ece-b84febb9ed50", 00:22:21.823 "strip_size_kb": 0, 00:22:21.823 "state": "configuring", 00:22:21.823 "raid_level": "raid1", 00:22:21.823 "superblock": true, 00:22:21.823 "num_base_bdevs": 2, 
00:22:21.823 "num_base_bdevs_discovered": 1, 00:22:21.823 "num_base_bdevs_operational": 2, 00:22:21.823 "base_bdevs_list": [ 00:22:21.823 { 00:22:21.823 "name": "BaseBdev1", 00:22:21.823 "uuid": "c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:21.823 "is_configured": true, 00:22:21.823 "data_offset": 256, 00:22:21.823 "data_size": 7936 00:22:21.823 }, 00:22:21.823 { 00:22:21.823 "name": "BaseBdev2", 00:22:21.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.823 "is_configured": false, 00:22:21.823 "data_offset": 0, 00:22:21.823 "data_size": 0 00:22:21.823 } 00:22:21.823 ] 00:22:21.823 }' 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.823 23:44:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:22.390 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:22:22.648 [2024-07-24 23:44:07.436560] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:22.649 [2024-07-24 23:44:07.436657] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xa07c70 00:22:22.649 [2024-07-24 23:44:07.436665] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:22.649 [2024-07-24 23:44:07.436726] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xba5a50 00:22:22.649 [2024-07-24 23:44:07.436778] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa07c70 00:22:22.649 [2024-07-24 23:44:07.436786] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa07c70 00:22:22.649 [2024-07-24 23:44:07.436823] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.649 BaseBdev2 
00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:22.649 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:22.907 [ 00:22:22.907 { 00:22:22.907 "name": "BaseBdev2", 00:22:22.907 "aliases": [ 00:22:22.907 "53d1beb7-9d9d-41b0-be43-266cce11fc94" 00:22:22.907 ], 00:22:22.907 "product_name": "Malloc disk", 00:22:22.907 "block_size": 4128, 00:22:22.907 "num_blocks": 8192, 00:22:22.907 "uuid": "53d1beb7-9d9d-41b0-be43-266cce11fc94", 00:22:22.907 "md_size": 32, 00:22:22.907 "md_interleave": true, 00:22:22.907 "dif_type": 0, 00:22:22.907 "assigned_rate_limits": { 00:22:22.907 "rw_ios_per_sec": 0, 00:22:22.907 "rw_mbytes_per_sec": 0, 00:22:22.907 "r_mbytes_per_sec": 0, 00:22:22.907 "w_mbytes_per_sec": 0 00:22:22.907 }, 00:22:22.907 "claimed": true, 00:22:22.907 "claim_type": "exclusive_write", 00:22:22.907 "zoned": false, 00:22:22.907 "supported_io_types": { 
00:22:22.907 "read": true, 00:22:22.907 "write": true, 00:22:22.907 "unmap": true, 00:22:22.907 "flush": true, 00:22:22.907 "reset": true, 00:22:22.907 "nvme_admin": false, 00:22:22.907 "nvme_io": false, 00:22:22.907 "nvme_io_md": false, 00:22:22.907 "write_zeroes": true, 00:22:22.907 "zcopy": true, 00:22:22.907 "get_zone_info": false, 00:22:22.907 "zone_management": false, 00:22:22.907 "zone_append": false, 00:22:22.907 "compare": false, 00:22:22.907 "compare_and_write": false, 00:22:22.907 "abort": true, 00:22:22.907 "seek_hole": false, 00:22:22.907 "seek_data": false, 00:22:22.907 "copy": true, 00:22:22.907 "nvme_iov_md": false 00:22:22.907 }, 00:22:22.907 "memory_domains": [ 00:22:22.907 { 00:22:22.907 "dma_device_id": "system", 00:22:22.907 "dma_device_type": 1 00:22:22.907 }, 00:22:22.907 { 00:22:22.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.907 "dma_device_type": 2 00:22:22.907 } 00:22:22.907 ], 00:22:22.907 "driver_specific": {} 00:22:22.907 } 00:22:22.907 ] 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.907 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.908 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.908 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.908 23:44:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.166 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.166 "name": "Existed_Raid", 00:22:23.166 "uuid": "5ff18d39-2d05-4adc-8ece-b84febb9ed50", 00:22:23.166 "strip_size_kb": 0, 00:22:23.166 "state": "online", 00:22:23.166 "raid_level": "raid1", 00:22:23.166 "superblock": true, 00:22:23.166 "num_base_bdevs": 2, 00:22:23.166 "num_base_bdevs_discovered": 2, 00:22:23.166 "num_base_bdevs_operational": 2, 00:22:23.166 "base_bdevs_list": [ 00:22:23.166 { 00:22:23.166 "name": "BaseBdev1", 00:22:23.166 "uuid": "c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:23.166 "is_configured": true, 00:22:23.166 "data_offset": 256, 00:22:23.166 "data_size": 7936 00:22:23.166 }, 00:22:23.166 { 00:22:23.166 "name": "BaseBdev2", 00:22:23.166 "uuid": "53d1beb7-9d9d-41b0-be43-266cce11fc94", 00:22:23.166 "is_configured": true, 00:22:23.166 "data_offset": 256, 00:22:23.166 
"data_size": 7936 00:22:23.166 } 00:22:23.166 ] 00:22:23.166 }' 00:22:23.166 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.166 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:23.425 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:23.684 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:23.684 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:23.684 [2024-07-24 23:44:08.579705] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.684 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:23.684 "name": "Existed_Raid", 00:22:23.684 "aliases": [ 00:22:23.684 "5ff18d39-2d05-4adc-8ece-b84febb9ed50" 00:22:23.684 ], 00:22:23.684 "product_name": "Raid Volume", 00:22:23.684 "block_size": 4128, 00:22:23.684 "num_blocks": 7936, 00:22:23.684 "uuid": "5ff18d39-2d05-4adc-8ece-b84febb9ed50", 00:22:23.684 "md_size": 32, 
00:22:23.684 "md_interleave": true, 00:22:23.684 "dif_type": 0, 00:22:23.684 "assigned_rate_limits": { 00:22:23.684 "rw_ios_per_sec": 0, 00:22:23.684 "rw_mbytes_per_sec": 0, 00:22:23.684 "r_mbytes_per_sec": 0, 00:22:23.684 "w_mbytes_per_sec": 0 00:22:23.684 }, 00:22:23.684 "claimed": false, 00:22:23.684 "zoned": false, 00:22:23.684 "supported_io_types": { 00:22:23.684 "read": true, 00:22:23.684 "write": true, 00:22:23.684 "unmap": false, 00:22:23.684 "flush": false, 00:22:23.684 "reset": true, 00:22:23.684 "nvme_admin": false, 00:22:23.684 "nvme_io": false, 00:22:23.684 "nvme_io_md": false, 00:22:23.684 "write_zeroes": true, 00:22:23.684 "zcopy": false, 00:22:23.684 "get_zone_info": false, 00:22:23.684 "zone_management": false, 00:22:23.684 "zone_append": false, 00:22:23.684 "compare": false, 00:22:23.684 "compare_and_write": false, 00:22:23.684 "abort": false, 00:22:23.684 "seek_hole": false, 00:22:23.684 "seek_data": false, 00:22:23.684 "copy": false, 00:22:23.684 "nvme_iov_md": false 00:22:23.684 }, 00:22:23.684 "memory_domains": [ 00:22:23.684 { 00:22:23.684 "dma_device_id": "system", 00:22:23.684 "dma_device_type": 1 00:22:23.684 }, 00:22:23.684 { 00:22:23.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.684 "dma_device_type": 2 00:22:23.684 }, 00:22:23.684 { 00:22:23.685 "dma_device_id": "system", 00:22:23.685 "dma_device_type": 1 00:22:23.685 }, 00:22:23.685 { 00:22:23.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.685 "dma_device_type": 2 00:22:23.685 } 00:22:23.685 ], 00:22:23.685 "driver_specific": { 00:22:23.685 "raid": { 00:22:23.685 "uuid": "5ff18d39-2d05-4adc-8ece-b84febb9ed50", 00:22:23.685 "strip_size_kb": 0, 00:22:23.685 "state": "online", 00:22:23.685 "raid_level": "raid1", 00:22:23.685 "superblock": true, 00:22:23.685 "num_base_bdevs": 2, 00:22:23.685 "num_base_bdevs_discovered": 2, 00:22:23.685 "num_base_bdevs_operational": 2, 00:22:23.685 "base_bdevs_list": [ 00:22:23.685 { 00:22:23.685 "name": "BaseBdev1", 00:22:23.685 "uuid": 
"c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:23.685 "is_configured": true, 00:22:23.685 "data_offset": 256, 00:22:23.685 "data_size": 7936 00:22:23.685 }, 00:22:23.685 { 00:22:23.685 "name": "BaseBdev2", 00:22:23.685 "uuid": "53d1beb7-9d9d-41b0-be43-266cce11fc94", 00:22:23.685 "is_configured": true, 00:22:23.685 "data_offset": 256, 00:22:23.685 "data_size": 7936 00:22:23.685 } 00:22:23.685 ] 00:22:23.685 } 00:22:23.685 } 00:22:23.685 }' 00:22:23.685 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:23.685 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:23.685 BaseBdev2' 00:22:23.685 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:23.685 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:23.685 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:23.944 "name": "BaseBdev1", 00:22:23.944 "aliases": [ 00:22:23.944 "c9d751e8-7303-4799-b1ee-addc03c7aba4" 00:22:23.944 ], 00:22:23.944 "product_name": "Malloc disk", 00:22:23.944 "block_size": 4128, 00:22:23.944 "num_blocks": 8192, 00:22:23.944 "uuid": "c9d751e8-7303-4799-b1ee-addc03c7aba4", 00:22:23.944 "md_size": 32, 00:22:23.944 "md_interleave": true, 00:22:23.944 "dif_type": 0, 00:22:23.944 "assigned_rate_limits": { 00:22:23.944 "rw_ios_per_sec": 0, 00:22:23.944 "rw_mbytes_per_sec": 0, 00:22:23.944 "r_mbytes_per_sec": 0, 00:22:23.944 "w_mbytes_per_sec": 0 00:22:23.944 }, 00:22:23.944 "claimed": 
true, 00:22:23.944 "claim_type": "exclusive_write", 00:22:23.944 "zoned": false, 00:22:23.944 "supported_io_types": { 00:22:23.944 "read": true, 00:22:23.944 "write": true, 00:22:23.944 "unmap": true, 00:22:23.944 "flush": true, 00:22:23.944 "reset": true, 00:22:23.944 "nvme_admin": false, 00:22:23.944 "nvme_io": false, 00:22:23.944 "nvme_io_md": false, 00:22:23.944 "write_zeroes": true, 00:22:23.944 "zcopy": true, 00:22:23.944 "get_zone_info": false, 00:22:23.944 "zone_management": false, 00:22:23.944 "zone_append": false, 00:22:23.944 "compare": false, 00:22:23.944 "compare_and_write": false, 00:22:23.944 "abort": true, 00:22:23.944 "seek_hole": false, 00:22:23.944 "seek_data": false, 00:22:23.944 "copy": true, 00:22:23.944 "nvme_iov_md": false 00:22:23.944 }, 00:22:23.944 "memory_domains": [ 00:22:23.944 { 00:22:23.944 "dma_device_id": "system", 00:22:23.944 "dma_device_type": 1 00:22:23.944 }, 00:22:23.944 { 00:22:23.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.944 "dma_device_type": 2 00:22:23.944 } 00:22:23.944 ], 00:22:23.944 "driver_specific": {} 00:22:23.944 }' 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:23.944 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.202 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:24.202 23:44:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.202 23:44:09 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.202 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:24.203 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.461 "name": "BaseBdev2", 00:22:24.461 "aliases": [ 00:22:24.461 "53d1beb7-9d9d-41b0-be43-266cce11fc94" 00:22:24.461 ], 00:22:24.461 "product_name": "Malloc disk", 00:22:24.461 "block_size": 4128, 00:22:24.461 "num_blocks": 8192, 00:22:24.461 "uuid": "53d1beb7-9d9d-41b0-be43-266cce11fc94", 00:22:24.461 "md_size": 32, 00:22:24.461 "md_interleave": true, 00:22:24.461 "dif_type": 0, 00:22:24.461 "assigned_rate_limits": { 00:22:24.461 "rw_ios_per_sec": 0, 00:22:24.461 "rw_mbytes_per_sec": 0, 00:22:24.461 "r_mbytes_per_sec": 0, 00:22:24.461 "w_mbytes_per_sec": 0 00:22:24.461 }, 00:22:24.461 "claimed": true, 00:22:24.461 "claim_type": "exclusive_write", 00:22:24.461 "zoned": false, 00:22:24.461 "supported_io_types": { 00:22:24.461 "read": true, 00:22:24.461 "write": true, 00:22:24.461 "unmap": true, 00:22:24.461 
"flush": true, 00:22:24.461 "reset": true, 00:22:24.461 "nvme_admin": false, 00:22:24.461 "nvme_io": false, 00:22:24.461 "nvme_io_md": false, 00:22:24.461 "write_zeroes": true, 00:22:24.461 "zcopy": true, 00:22:24.461 "get_zone_info": false, 00:22:24.461 "zone_management": false, 00:22:24.461 "zone_append": false, 00:22:24.461 "compare": false, 00:22:24.461 "compare_and_write": false, 00:22:24.461 "abort": true, 00:22:24.461 "seek_hole": false, 00:22:24.461 "seek_data": false, 00:22:24.461 "copy": true, 00:22:24.461 "nvme_iov_md": false 00:22:24.461 }, 00:22:24.461 "memory_domains": [ 00:22:24.461 { 00:22:24.461 "dma_device_id": "system", 00:22:24.461 "dma_device_type": 1 00:22:24.461 }, 00:22:24.461 { 00:22:24.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.461 "dma_device_type": 2 00:22:24.461 } 00:22:24.461 ], 00:22:24.461 "driver_specific": {} 00:22:24.461 }' 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.461 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:24.720 23:44:09 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:24.720 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:24.979 [2024-07-24 23:44:09.766618] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.979 23:44:09 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.979 "name": "Existed_Raid", 00:22:24.979 "uuid": "5ff18d39-2d05-4adc-8ece-b84febb9ed50", 00:22:24.979 "strip_size_kb": 0, 00:22:24.979 "state": "online", 00:22:24.979 "raid_level": "raid1", 00:22:24.979 "superblock": true, 00:22:24.979 "num_base_bdevs": 2, 00:22:24.979 "num_base_bdevs_discovered": 1, 00:22:24.979 "num_base_bdevs_operational": 1, 00:22:24.979 "base_bdevs_list": [ 00:22:24.979 { 00:22:24.979 "name": null, 00:22:24.979 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.979 "is_configured": false, 00:22:24.979 "data_offset": 256, 00:22:24.979 "data_size": 7936 00:22:24.979 }, 00:22:24.979 { 00:22:24.979 "name": "BaseBdev2", 00:22:24.979 "uuid": "53d1beb7-9d9d-41b0-be43-266cce11fc94", 00:22:24.979 "is_configured": true, 00:22:24.979 "data_offset": 256, 00:22:24.979 "data_size": 7936 00:22:24.979 } 00:22:24.979 ] 00:22:24.979 }' 00:22:24.979 
23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.979 23:44:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:25.547 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:25.547 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:25.547 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:25.547 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:25.806 [2024-07-24 23:44:10.758074] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:25.806 [2024-07-24 23:44:10.758137] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:25.806 [2024-07-24 23:44:10.768194] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:25.806 [2024-07-24 23:44:10.768236] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:25.806 [2024-07-24 23:44:10.768242] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa07c70 name Existed_Raid, state offline 00:22:25.806 23:44:10 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.806 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 395140 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 395140 ']' 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 395140 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 395140 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 
00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 395140' 00:22:26.066 killing process with pid 395140 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 395140 00:22:26.066 [2024-07-24 23:44:10.995959] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:26.066 23:44:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 395140 00:22:26.066 [2024-07-24 23:44:10.996729] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:26.326 23:44:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:22:26.326 00:22:26.326 real 0m8.014s 00:22:26.326 user 0m14.407s 00:22:26.326 sys 0m1.288s 00:22:26.326 23:44:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:26.326 23:44:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:26.326 ************************************ 00:22:26.326 END TEST raid_state_function_test_sb_md_interleaved 00:22:26.326 ************************************ 00:22:26.326 23:44:11 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:22:26.326 23:44:11 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:22:26.326 23:44:11 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:26.326 23:44:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:26.326 ************************************ 00:22:26.326 START TEST raid_superblock_test_md_interleaved 00:22:26.326 ************************************ 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:22:26.326 23:44:11 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=396731 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # 
waitforlisten 396731 /var/tmp/spdk-raid.sock 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 396731 ']' 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:26.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:26.326 23:44:11 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:26.326 [2024-07-24 23:44:11.270050] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:22:26.326 [2024-07-24 23:44:11.270085] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid396731 ] 00:22:26.585 [2024-07-24 23:44:11.333051] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:26.585 [2024-07-24 23:44:11.410982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:26.585 [2024-07-24 23:44:11.461169] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:26.585 [2024-07-24 23:44:11.461195] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:27.153 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:27.153 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:22:27.153 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:22:27.154 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:22:27.413 malloc1 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:27.413 [2024-07-24 23:44:12.388709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:27.413 [2024-07-24 23:44:12.388744] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.413 [2024-07-24 23:44:12.388757] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132f6f0 00:22:27.413 [2024-07-24 23:44:12.388763] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.413 [2024-07-24 23:44:12.389772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.413 [2024-07-24 23:44:12.389792] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:27.413 pt1 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:27.413 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:22:27.672 malloc2 00:22:27.672 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:27.932 [2024-07-24 23:44:12.713183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:27.932 [2024-07-24 23:44:12.713214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.932 [2024-07-24 23:44:12.713223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bce50 00:22:27.932 [2024-07-24 23:44:12.713229] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.932 [2024-07-24 23:44:12.714242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.932 [2024-07-24 23:44:12.714261] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:27.932 pt2 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:22:27.932 [2024-07-24 23:44:12.881653] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:27.932 [2024-07-24 23:44:12.882588] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:27.932 [2024-07-24 23:44:12.882698] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14be8e0 00:22:27.932 [2024-07-24 23:44:12.882707] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:27.932 [2024-07-24 23:44:12.882757] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132d8b0 00:22:27.932 [2024-07-24 23:44:12.882814] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14be8e0 00:22:27.932 [2024-07-24 23:44:12.882825] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14be8e0 00:22:27.932 [2024-07-24 23:44:12.882863] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.932 23:44:12 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.191 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.191 "name": "raid_bdev1", 00:22:28.191 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:28.191 "strip_size_kb": 0, 00:22:28.191 "state": "online", 00:22:28.191 "raid_level": "raid1", 00:22:28.191 "superblock": true, 00:22:28.191 "num_base_bdevs": 2, 00:22:28.191 "num_base_bdevs_discovered": 2, 00:22:28.191 "num_base_bdevs_operational": 2, 00:22:28.191 "base_bdevs_list": [ 00:22:28.191 { 00:22:28.191 "name": "pt1", 00:22:28.191 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:28.191 "is_configured": true, 00:22:28.191 "data_offset": 256, 00:22:28.191 "data_size": 7936 00:22:28.191 }, 00:22:28.191 { 00:22:28.191 "name": "pt2", 00:22:28.191 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:28.191 "is_configured": true, 00:22:28.191 "data_offset": 256, 00:22:28.191 "data_size": 7936 00:22:28.191 } 00:22:28.191 ] 00:22:28.191 }' 00:22:28.191 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.191 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:28.758 23:44:13 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:28.758 [2024-07-24 23:44:13.719955] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:28.758 "name": "raid_bdev1", 00:22:28.758 "aliases": [ 00:22:28.758 "5a2008db-f6ad-4a8c-8c8d-f39da081debb" 00:22:28.758 ], 00:22:28.758 "product_name": "Raid Volume", 00:22:28.758 "block_size": 4128, 00:22:28.758 "num_blocks": 7936, 00:22:28.758 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:28.758 "md_size": 32, 00:22:28.758 "md_interleave": true, 00:22:28.758 "dif_type": 0, 00:22:28.758 "assigned_rate_limits": { 00:22:28.758 "rw_ios_per_sec": 0, 00:22:28.758 "rw_mbytes_per_sec": 0, 00:22:28.758 "r_mbytes_per_sec": 0, 00:22:28.758 "w_mbytes_per_sec": 0 00:22:28.758 }, 00:22:28.758 "claimed": false, 00:22:28.758 "zoned": false, 00:22:28.758 "supported_io_types": { 00:22:28.758 "read": true, 00:22:28.758 "write": true, 00:22:28.758 "unmap": false, 00:22:28.758 "flush": false, 00:22:28.758 "reset": true, 00:22:28.758 "nvme_admin": false, 
00:22:28.758 "nvme_io": false, 00:22:28.758 "nvme_io_md": false, 00:22:28.758 "write_zeroes": true, 00:22:28.758 "zcopy": false, 00:22:28.758 "get_zone_info": false, 00:22:28.758 "zone_management": false, 00:22:28.758 "zone_append": false, 00:22:28.758 "compare": false, 00:22:28.758 "compare_and_write": false, 00:22:28.758 "abort": false, 00:22:28.758 "seek_hole": false, 00:22:28.758 "seek_data": false, 00:22:28.758 "copy": false, 00:22:28.758 "nvme_iov_md": false 00:22:28.758 }, 00:22:28.758 "memory_domains": [ 00:22:28.758 { 00:22:28.758 "dma_device_id": "system", 00:22:28.758 "dma_device_type": 1 00:22:28.758 }, 00:22:28.758 { 00:22:28.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.758 "dma_device_type": 2 00:22:28.758 }, 00:22:28.758 { 00:22:28.758 "dma_device_id": "system", 00:22:28.758 "dma_device_type": 1 00:22:28.758 }, 00:22:28.758 { 00:22:28.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.758 "dma_device_type": 2 00:22:28.758 } 00:22:28.758 ], 00:22:28.758 "driver_specific": { 00:22:28.758 "raid": { 00:22:28.758 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:28.758 "strip_size_kb": 0, 00:22:28.758 "state": "online", 00:22:28.758 "raid_level": "raid1", 00:22:28.758 "superblock": true, 00:22:28.758 "num_base_bdevs": 2, 00:22:28.758 "num_base_bdevs_discovered": 2, 00:22:28.758 "num_base_bdevs_operational": 2, 00:22:28.758 "base_bdevs_list": [ 00:22:28.758 { 00:22:28.758 "name": "pt1", 00:22:28.758 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:28.758 "is_configured": true, 00:22:28.758 "data_offset": 256, 00:22:28.758 "data_size": 7936 00:22:28.758 }, 00:22:28.758 { 00:22:28.758 "name": "pt2", 00:22:28.758 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:28.758 "is_configured": true, 00:22:28.758 "data_offset": 256, 00:22:28.758 "data_size": 7936 00:22:28.758 } 00:22:28.758 ] 00:22:28.758 } 00:22:28.758 } 00:22:28.758 }' 00:22:28.758 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:29.017 pt2' 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:29.017 "name": "pt1", 00:22:29.017 "aliases": [ 00:22:29.017 "00000000-0000-0000-0000-000000000001" 00:22:29.017 ], 00:22:29.017 "product_name": "passthru", 00:22:29.017 "block_size": 4128, 00:22:29.017 "num_blocks": 8192, 00:22:29.017 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:29.017 "md_size": 32, 00:22:29.017 "md_interleave": true, 00:22:29.017 "dif_type": 0, 00:22:29.017 "assigned_rate_limits": { 00:22:29.017 "rw_ios_per_sec": 0, 00:22:29.017 "rw_mbytes_per_sec": 0, 00:22:29.017 "r_mbytes_per_sec": 0, 00:22:29.017 "w_mbytes_per_sec": 0 00:22:29.017 }, 00:22:29.017 "claimed": true, 00:22:29.017 "claim_type": "exclusive_write", 00:22:29.017 "zoned": false, 00:22:29.017 "supported_io_types": { 00:22:29.017 "read": true, 00:22:29.017 "write": true, 00:22:29.017 "unmap": true, 00:22:29.017 "flush": true, 00:22:29.017 "reset": true, 00:22:29.017 "nvme_admin": false, 00:22:29.017 "nvme_io": false, 00:22:29.017 "nvme_io_md": false, 00:22:29.017 "write_zeroes": true, 00:22:29.017 "zcopy": true, 00:22:29.017 "get_zone_info": false, 00:22:29.017 "zone_management": false, 00:22:29.017 "zone_append": false, 00:22:29.017 "compare": false, 00:22:29.017 "compare_and_write": false, 00:22:29.017 
"abort": true, 00:22:29.017 "seek_hole": false, 00:22:29.017 "seek_data": false, 00:22:29.017 "copy": true, 00:22:29.017 "nvme_iov_md": false 00:22:29.017 }, 00:22:29.017 "memory_domains": [ 00:22:29.017 { 00:22:29.017 "dma_device_id": "system", 00:22:29.017 "dma_device_type": 1 00:22:29.017 }, 00:22:29.017 { 00:22:29.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:29.017 "dma_device_type": 2 00:22:29.017 } 00:22:29.017 ], 00:22:29.017 "driver_specific": { 00:22:29.017 "passthru": { 00:22:29.017 "name": "pt1", 00:22:29.017 "base_bdev_name": "malloc1" 00:22:29.017 } 00:22:29.017 } 00:22:29.017 }' 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:29.017 23:44:13 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:29.275 23:44:14 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:29.275 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:29.534 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:29.534 "name": "pt2", 00:22:29.534 "aliases": [ 00:22:29.534 "00000000-0000-0000-0000-000000000002" 00:22:29.534 ], 00:22:29.534 "product_name": "passthru", 00:22:29.534 "block_size": 4128, 00:22:29.534 "num_blocks": 8192, 00:22:29.534 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:29.534 "md_size": 32, 00:22:29.534 "md_interleave": true, 00:22:29.534 "dif_type": 0, 00:22:29.534 "assigned_rate_limits": { 00:22:29.534 "rw_ios_per_sec": 0, 00:22:29.534 "rw_mbytes_per_sec": 0, 00:22:29.534 "r_mbytes_per_sec": 0, 00:22:29.534 "w_mbytes_per_sec": 0 00:22:29.534 }, 00:22:29.534 "claimed": true, 00:22:29.534 "claim_type": "exclusive_write", 00:22:29.534 "zoned": false, 00:22:29.534 "supported_io_types": { 00:22:29.534 "read": true, 00:22:29.534 "write": true, 00:22:29.534 "unmap": true, 00:22:29.534 "flush": true, 00:22:29.534 "reset": true, 00:22:29.534 "nvme_admin": false, 00:22:29.534 "nvme_io": false, 00:22:29.534 "nvme_io_md": false, 00:22:29.534 "write_zeroes": true, 00:22:29.534 "zcopy": true, 00:22:29.534 "get_zone_info": false, 00:22:29.534 "zone_management": false, 00:22:29.534 "zone_append": false, 00:22:29.534 "compare": false, 00:22:29.534 "compare_and_write": false, 00:22:29.534 "abort": true, 00:22:29.534 "seek_hole": false, 00:22:29.534 "seek_data": false, 00:22:29.534 "copy": true, 00:22:29.534 "nvme_iov_md": false 00:22:29.534 }, 00:22:29.534 "memory_domains": [ 00:22:29.534 { 00:22:29.534 "dma_device_id": 
"system", 00:22:29.534 "dma_device_type": 1 00:22:29.534 }, 00:22:29.534 { 00:22:29.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:29.534 "dma_device_type": 2 00:22:29.534 } 00:22:29.534 ], 00:22:29.534 "driver_specific": { 00:22:29.534 "passthru": { 00:22:29.534 "name": "pt2", 00:22:29.534 "base_bdev_name": "malloc2" 00:22:29.534 } 00:22:29.534 } 00:22:29.534 }' 00:22:29.534 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:29.534 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:29.534 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:29.534 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:29.793 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:29.793 23:44:14 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:30.052 [2024-07-24 23:44:14.882960] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:30.052 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5a2008db-f6ad-4a8c-8c8d-f39da081debb 00:22:30.052 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 5a2008db-f6ad-4a8c-8c8d-f39da081debb ']' 00:22:30.052 23:44:14 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:30.052 [2024-07-24 23:44:15.051235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:30.052 [2024-07-24 23:44:15.051249] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:30.052 [2024-07-24 23:44:15.051287] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:30.052 [2024-07-24 23:44:15.051326] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:30.052 [2024-07-24 23:44:15.051332] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14be8e0 name raid_bdev1, state offline 00:22:30.311 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.311 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:30.311 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:30.311 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:30.311 23:44:15 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:30.311 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:30.570 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:30.570 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:30.829 23:44:15 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:30.829 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:31.088 [2024-07-24 23:44:15.889393] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:31.088 [2024-07-24 23:44:15.890366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:31.088 [2024-07-24 23:44:15.890407] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:31.088 [2024-07-24 23:44:15.890433] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:31.088 [2024-07-24 23:44:15.890443] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:31.088 [2024-07-24 23:44:15.890464] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132f030 name raid_bdev1, state configuring 00:22:31.088 request: 00:22:31.088 { 00:22:31.088 "name": "raid_bdev1", 00:22:31.088 "raid_level": "raid1", 00:22:31.088 "base_bdevs": [ 00:22:31.088 "malloc1", 00:22:31.088 "malloc2" 00:22:31.088 ], 00:22:31.088 "superblock": false, 00:22:31.088 "method": "bdev_raid_create", 00:22:31.088 "req_id": 1 00:22:31.088 } 00:22:31.088 Got JSON-RPC error response 00:22:31.088 response: 00:22:31.088 { 00:22:31.088 "code": -17, 00:22:31.088 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:31.088 } 00:22:31.088 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:22:31.089 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:22:31.089 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:22:31.089 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:22:31.089 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.089 23:44:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:31.089 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:31.089 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:31.089 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:22:31.348 [2024-07-24 23:44:16.202162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:31.348 [2024-07-24 23:44:16.202187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:31.348 [2024-07-24 23:44:16.202197] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bf310 00:22:31.348 [2024-07-24 23:44:16.202203] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:31.348 [2024-07-24 23:44:16.203199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:31.348 [2024-07-24 23:44:16.203218] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:31.348 [2024-07-24 23:44:16.203250] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:31.348 [2024-07-24 23:44:16.203267] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:31.348 pt1 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.348 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.607 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.607 "name": "raid_bdev1", 00:22:31.607 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:31.607 "strip_size_kb": 0, 00:22:31.607 "state": "configuring", 00:22:31.607 "raid_level": "raid1", 00:22:31.607 "superblock": true, 00:22:31.607 "num_base_bdevs": 2, 00:22:31.607 "num_base_bdevs_discovered": 1, 00:22:31.607 "num_base_bdevs_operational": 2, 00:22:31.607 "base_bdevs_list": [ 00:22:31.607 { 00:22:31.607 "name": "pt1", 00:22:31.607 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:31.607 "is_configured": true, 00:22:31.607 "data_offset": 256, 00:22:31.607 "data_size": 7936 00:22:31.607 }, 00:22:31.607 { 00:22:31.607 "name": null, 00:22:31.607 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:31.607 "is_configured": false, 00:22:31.607 "data_offset": 256, 00:22:31.607 "data_size": 7936 00:22:31.607 } 00:22:31.607 ] 00:22:31.607 }' 00:22:31.607 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.607 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:32.174 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:22:32.174 23:44:16 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:32.174 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:32.174 23:44:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:32.174 [2024-07-24 23:44:17.020288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:32.174 [2024-07-24 23:44:17.020324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.174 [2024-07-24 23:44:17.020335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14bd6d0 00:22:32.174 [2024-07-24 23:44:17.020341] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.174 [2024-07-24 23:44:17.020462] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.174 [2024-07-24 23:44:17.020475] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:32.174 [2024-07-24 23:44:17.020519] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:32.174 [2024-07-24 23:44:17.020530] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:32.174 [2024-07-24 23:44:17.020588] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14bf750 00:22:32.174 [2024-07-24 23:44:17.020594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:32.174 [2024-07-24 23:44:17.020630] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c1710 00:22:32.174 [2024-07-24 23:44:17.020679] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14bf750 00:22:32.174 [2024-07-24 23:44:17.020688] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14bf750 00:22:32.174 [2024-07-24 23:44:17.020727] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.174 pt2 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.174 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.433 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.433 "name": "raid_bdev1", 00:22:32.433 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:32.433 "strip_size_kb": 0, 00:22:32.433 "state": "online", 00:22:32.433 "raid_level": "raid1", 00:22:32.433 "superblock": true, 00:22:32.433 "num_base_bdevs": 2, 00:22:32.433 "num_base_bdevs_discovered": 2, 00:22:32.433 "num_base_bdevs_operational": 2, 00:22:32.433 "base_bdevs_list": [ 00:22:32.433 { 00:22:32.433 "name": "pt1", 00:22:32.433 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:32.433 "is_configured": true, 00:22:32.433 "data_offset": 256, 00:22:32.433 "data_size": 7936 00:22:32.433 }, 00:22:32.433 { 00:22:32.433 "name": "pt2", 00:22:32.433 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:32.433 "is_configured": true, 00:22:32.433 "data_offset": 256, 00:22:32.433 "data_size": 7936 00:22:32.433 } 00:22:32.433 ] 00:22:32.433 }' 00:22:32.433 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.433 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:33.000 23:44:17 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:33.000 [2024-07-24 23:44:17.850627] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:33.000 "name": "raid_bdev1", 00:22:33.000 "aliases": [ 00:22:33.000 "5a2008db-f6ad-4a8c-8c8d-f39da081debb" 00:22:33.000 ], 00:22:33.000 "product_name": "Raid Volume", 00:22:33.000 "block_size": 4128, 00:22:33.000 "num_blocks": 7936, 00:22:33.000 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:33.000 "md_size": 32, 00:22:33.000 "md_interleave": true, 00:22:33.000 "dif_type": 0, 00:22:33.000 "assigned_rate_limits": { 00:22:33.000 "rw_ios_per_sec": 0, 00:22:33.000 "rw_mbytes_per_sec": 0, 00:22:33.000 "r_mbytes_per_sec": 0, 00:22:33.000 "w_mbytes_per_sec": 0 00:22:33.000 }, 00:22:33.000 "claimed": false, 00:22:33.000 "zoned": false, 00:22:33.000 "supported_io_types": { 00:22:33.000 "read": true, 00:22:33.000 "write": true, 00:22:33.000 "unmap": false, 00:22:33.000 "flush": false, 00:22:33.000 "reset": true, 00:22:33.000 "nvme_admin": false, 00:22:33.000 "nvme_io": false, 00:22:33.000 "nvme_io_md": false, 00:22:33.000 "write_zeroes": true, 00:22:33.000 "zcopy": false, 00:22:33.000 "get_zone_info": false, 00:22:33.000 "zone_management": false, 00:22:33.000 "zone_append": false, 00:22:33.000 "compare": false, 00:22:33.000 "compare_and_write": false, 00:22:33.000 "abort": false, 00:22:33.000 "seek_hole": false, 00:22:33.000 "seek_data": false, 00:22:33.000 "copy": false, 00:22:33.000 "nvme_iov_md": false 00:22:33.000 }, 
00:22:33.000 "memory_domains": [ 00:22:33.000 { 00:22:33.000 "dma_device_id": "system", 00:22:33.000 "dma_device_type": 1 00:22:33.000 }, 00:22:33.000 { 00:22:33.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.000 "dma_device_type": 2 00:22:33.000 }, 00:22:33.000 { 00:22:33.000 "dma_device_id": "system", 00:22:33.000 "dma_device_type": 1 00:22:33.000 }, 00:22:33.000 { 00:22:33.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.000 "dma_device_type": 2 00:22:33.000 } 00:22:33.000 ], 00:22:33.000 "driver_specific": { 00:22:33.000 "raid": { 00:22:33.000 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:33.000 "strip_size_kb": 0, 00:22:33.000 "state": "online", 00:22:33.000 "raid_level": "raid1", 00:22:33.000 "superblock": true, 00:22:33.000 "num_base_bdevs": 2, 00:22:33.000 "num_base_bdevs_discovered": 2, 00:22:33.000 "num_base_bdevs_operational": 2, 00:22:33.000 "base_bdevs_list": [ 00:22:33.000 { 00:22:33.000 "name": "pt1", 00:22:33.000 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:33.000 "is_configured": true, 00:22:33.000 "data_offset": 256, 00:22:33.000 "data_size": 7936 00:22:33.000 }, 00:22:33.000 { 00:22:33.000 "name": "pt2", 00:22:33.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:33.000 "is_configured": true, 00:22:33.000 "data_offset": 256, 00:22:33.000 "data_size": 7936 00:22:33.000 } 00:22:33.000 ] 00:22:33.000 } 00:22:33.000 } 00:22:33.000 }' 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:33.000 pt2' 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:33.000 23:44:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:33.260 "name": "pt1", 00:22:33.260 "aliases": [ 00:22:33.260 "00000000-0000-0000-0000-000000000001" 00:22:33.260 ], 00:22:33.260 "product_name": "passthru", 00:22:33.260 "block_size": 4128, 00:22:33.260 "num_blocks": 8192, 00:22:33.260 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:33.260 "md_size": 32, 00:22:33.260 "md_interleave": true, 00:22:33.260 "dif_type": 0, 00:22:33.260 "assigned_rate_limits": { 00:22:33.260 "rw_ios_per_sec": 0, 00:22:33.260 "rw_mbytes_per_sec": 0, 00:22:33.260 "r_mbytes_per_sec": 0, 00:22:33.260 "w_mbytes_per_sec": 0 00:22:33.260 }, 00:22:33.260 "claimed": true, 00:22:33.260 "claim_type": "exclusive_write", 00:22:33.260 "zoned": false, 00:22:33.260 "supported_io_types": { 00:22:33.260 "read": true, 00:22:33.260 "write": true, 00:22:33.260 "unmap": true, 00:22:33.260 "flush": true, 00:22:33.260 "reset": true, 00:22:33.260 "nvme_admin": false, 00:22:33.260 "nvme_io": false, 00:22:33.260 "nvme_io_md": false, 00:22:33.260 "write_zeroes": true, 00:22:33.260 "zcopy": true, 00:22:33.260 "get_zone_info": false, 00:22:33.260 "zone_management": false, 00:22:33.260 "zone_append": false, 00:22:33.260 "compare": false, 00:22:33.260 "compare_and_write": false, 00:22:33.260 "abort": true, 00:22:33.260 "seek_hole": false, 00:22:33.260 "seek_data": false, 00:22:33.260 "copy": true, 00:22:33.260 "nvme_iov_md": false 00:22:33.260 }, 00:22:33.260 "memory_domains": [ 00:22:33.260 { 00:22:33.260 "dma_device_id": "system", 00:22:33.260 "dma_device_type": 1 00:22:33.260 }, 00:22:33.260 { 00:22:33.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.260 "dma_device_type": 2 00:22:33.260 } 00:22:33.260 ], 00:22:33.260 
"driver_specific": { 00:22:33.260 "passthru": { 00:22:33.260 "name": "pt1", 00:22:33.260 "base_bdev_name": "malloc1" 00:22:33.260 } 00:22:33.260 } 00:22:33.260 }' 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.260 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:33.518 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:33.776 23:44:18 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:33.776 "name": "pt2", 00:22:33.776 "aliases": [ 00:22:33.776 "00000000-0000-0000-0000-000000000002" 00:22:33.776 ], 00:22:33.776 "product_name": "passthru", 00:22:33.776 "block_size": 4128, 00:22:33.776 "num_blocks": 8192, 00:22:33.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:33.776 "md_size": 32, 00:22:33.776 "md_interleave": true, 00:22:33.776 "dif_type": 0, 00:22:33.776 "assigned_rate_limits": { 00:22:33.776 "rw_ios_per_sec": 0, 00:22:33.776 "rw_mbytes_per_sec": 0, 00:22:33.776 "r_mbytes_per_sec": 0, 00:22:33.776 "w_mbytes_per_sec": 0 00:22:33.776 }, 00:22:33.776 "claimed": true, 00:22:33.776 "claim_type": "exclusive_write", 00:22:33.776 "zoned": false, 00:22:33.776 "supported_io_types": { 00:22:33.776 "read": true, 00:22:33.776 "write": true, 00:22:33.776 "unmap": true, 00:22:33.776 "flush": true, 00:22:33.776 "reset": true, 00:22:33.776 "nvme_admin": false, 00:22:33.776 "nvme_io": false, 00:22:33.776 "nvme_io_md": false, 00:22:33.776 "write_zeroes": true, 00:22:33.777 "zcopy": true, 00:22:33.777 "get_zone_info": false, 00:22:33.777 "zone_management": false, 00:22:33.777 "zone_append": false, 00:22:33.777 "compare": false, 00:22:33.777 "compare_and_write": false, 00:22:33.777 "abort": true, 00:22:33.777 "seek_hole": false, 00:22:33.777 "seek_data": false, 00:22:33.777 "copy": true, 00:22:33.777 "nvme_iov_md": false 00:22:33.777 }, 00:22:33.777 "memory_domains": [ 00:22:33.777 { 00:22:33.777 "dma_device_id": "system", 00:22:33.777 "dma_device_type": 1 00:22:33.777 }, 00:22:33.777 { 00:22:33.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.777 "dma_device_type": 2 00:22:33.777 } 00:22:33.777 ], 00:22:33.777 "driver_specific": { 00:22:33.777 "passthru": { 00:22:33.777 "name": "pt2", 00:22:33.777 "base_bdev_name": "malloc2" 00:22:33.777 } 00:22:33.777 } 00:22:33.777 }' 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.777 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:34.035 23:44:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:34.035 [2024-07-24 23:44:18.981509] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:34.035 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 5a2008db-f6ad-4a8c-8c8d-f39da081debb '!=' 5a2008db-f6ad-4a8c-8c8d-f39da081debb ']' 00:22:34.035 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:34.035 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:34.035 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:22:34.035 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:34.294 [2024-07-24 23:44:19.157827] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.294 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.553 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.553 "name": "raid_bdev1", 00:22:34.553 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:34.553 "strip_size_kb": 0, 00:22:34.553 "state": "online", 00:22:34.553 "raid_level": "raid1", 00:22:34.553 "superblock": true, 00:22:34.553 "num_base_bdevs": 2, 00:22:34.553 "num_base_bdevs_discovered": 1, 00:22:34.553 "num_base_bdevs_operational": 1, 00:22:34.553 "base_bdevs_list": [ 00:22:34.553 { 00:22:34.553 "name": null, 00:22:34.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.553 "is_configured": false, 00:22:34.553 "data_offset": 256, 00:22:34.553 "data_size": 7936 00:22:34.553 }, 00:22:34.553 { 00:22:34.553 "name": "pt2", 00:22:34.553 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:34.553 "is_configured": true, 00:22:34.553 "data_offset": 256, 00:22:34.553 "data_size": 7936 00:22:34.553 } 00:22:34.553 ] 00:22:34.553 }' 00:22:34.553 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.553 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:35.120 23:44:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:35.120 [2024-07-24 23:44:19.991982] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:35.120 [2024-07-24 23:44:19.992002] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:35.120 [2024-07-24 23:44:19.992045] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:35.120 [2024-07-24 
23:44:19.992076] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:35.120 [2024-07-24 23:44:19.992082] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14bf750 name raid_bdev1, state offline 00:22:35.120 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.120 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:22:35.379 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:35.637 [2024-07-24 23:44:20.521330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:35.637 [2024-07-24 23:44:20.521364] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:35.638 [2024-07-24 23:44:20.521373] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14c19f0 00:22:35.638 [2024-07-24 23:44:20.521379] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:35.638 [2024-07-24 23:44:20.522413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:35.638 [2024-07-24 23:44:20.522432] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:35.638 [2024-07-24 23:44:20.522463] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:35.638 [2024-07-24 23:44:20.522488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:35.638 [2024-07-24 23:44:20.522536] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14bd900 00:22:35.638 [2024-07-24 23:44:20.522542] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:35.638 [2024-07-24 23:44:20.522583] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1324ff0 00:22:35.638 [2024-07-24 23:44:20.522632] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14bd900 00:22:35.638 [2024-07-24 23:44:20.522637] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14bd900 00:22:35.638 [2024-07-24 23:44:20.522671] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:35.638 pt2 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.638 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.896 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.896 "name": "raid_bdev1", 00:22:35.896 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:35.896 "strip_size_kb": 0, 00:22:35.896 "state": "online", 00:22:35.896 "raid_level": "raid1", 00:22:35.896 "superblock": true, 00:22:35.896 "num_base_bdevs": 2, 00:22:35.896 "num_base_bdevs_discovered": 1, 00:22:35.896 "num_base_bdevs_operational": 1, 00:22:35.896 
"base_bdevs_list": [ 00:22:35.896 { 00:22:35.896 "name": null, 00:22:35.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.896 "is_configured": false, 00:22:35.896 "data_offset": 256, 00:22:35.896 "data_size": 7936 00:22:35.896 }, 00:22:35.896 { 00:22:35.896 "name": "pt2", 00:22:35.896 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:35.896 "is_configured": true, 00:22:35.896 "data_offset": 256, 00:22:35.896 "data_size": 7936 00:22:35.896 } 00:22:35.896 ] 00:22:35.896 }' 00:22:35.896 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.896 23:44:20 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:36.463 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:36.463 [2024-07-24 23:44:21.359487] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:36.463 [2024-07-24 23:44:21.359505] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:36.463 [2024-07-24 23:44:21.359542] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:36.463 [2024-07-24 23:44:21.359572] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:36.463 [2024-07-24 23:44:21.359578] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14bd900 name raid_bdev1, state offline 00:22:36.463 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.463 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:36.722 [2024-07-24 23:44:21.700366] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:36.722 [2024-07-24 23:44:21.700397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.722 [2024-07-24 23:44:21.700408] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132e230 00:22:36.722 [2024-07-24 23:44:21.700415] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.722 [2024-07-24 23:44:21.701465] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.722 [2024-07-24 23:44:21.701494] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:36.722 [2024-07-24 23:44:21.701532] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:36.722 [2024-07-24 23:44:21.701549] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:36.722 [2024-07-24 23:44:21.701608] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:36.722 [2024-07-24 23:44:21.701615] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:36.722 [2024-07-24 23:44:21.701623] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c1eb0 name raid_bdev1, state configuring 00:22:36.722 [2024-07-24 23:44:21.701637] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:36.722 [2024-07-24 23:44:21.701672] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c1eb0 00:22:36.722 [2024-07-24 23:44:21.701678] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:36.722 [2024-07-24 23:44:21.701715] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c13d0 00:22:36.722 [2024-07-24 23:44:21.701768] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c1eb0 00:22:36.722 [2024-07-24 23:44:21.701774] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14c1eb0 00:22:36.722 [2024-07-24 23:44:21.701812] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:36.722 pt1 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.722 23:44:21 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.722 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.981 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.981 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.981 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.981 "name": "raid_bdev1", 00:22:36.981 "uuid": "5a2008db-f6ad-4a8c-8c8d-f39da081debb", 00:22:36.981 "strip_size_kb": 0, 00:22:36.981 "state": "online", 00:22:36.981 "raid_level": "raid1", 00:22:36.981 "superblock": true, 00:22:36.981 "num_base_bdevs": 2, 00:22:36.981 "num_base_bdevs_discovered": 1, 00:22:36.981 "num_base_bdevs_operational": 1, 00:22:36.981 "base_bdevs_list": [ 00:22:36.981 { 00:22:36.981 "name": null, 00:22:36.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.981 "is_configured": false, 00:22:36.981 "data_offset": 256, 00:22:36.981 "data_size": 7936 00:22:36.981 }, 00:22:36.981 { 00:22:36.981 "name": "pt2", 00:22:36.981 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:36.981 "is_configured": true, 00:22:36.981 "data_offset": 256, 00:22:36.981 "data_size": 7936 00:22:36.981 } 00:22:36.981 ] 00:22:36.981 }' 00:22:36.981 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.981 23:44:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:37.574 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:37.574 
23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:37.833 [2024-07-24 23:44:22.711143] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 5a2008db-f6ad-4a8c-8c8d-f39da081debb '!=' 5a2008db-f6ad-4a8c-8c8d-f39da081debb ']' 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 396731 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 396731 ']' 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 396731 00:22:37.833 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 396731 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- 
# echo 'killing process with pid 396731' 00:22:37.834 killing process with pid 396731 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@969 -- # kill 396731 00:22:37.834 [2024-07-24 23:44:22.779126] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:37.834 [2024-07-24 23:44:22.779172] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:37.834 [2024-07-24 23:44:22.779204] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:37.834 [2024-07-24 23:44:22.779210] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c1eb0 name raid_bdev1, state offline 00:22:37.834 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@974 -- # wait 396731 00:22:37.834 [2024-07-24 23:44:22.795055] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:38.093 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:22:38.093 00:22:38.093 real 0m11.746s 00:22:38.093 user 0m21.618s 00:22:38.093 sys 0m1.889s 00:22:38.093 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:38.093 23:44:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:38.093 ************************************ 00:22:38.093 END TEST raid_superblock_test_md_interleaved 00:22:38.093 ************************************ 00:22:38.093 23:44:22 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:22:38.093 23:44:22 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:22:38.093 23:44:22 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:38.093 23:44:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:38.093 ************************************ 
00:22:38.093 START TEST raid_rebuild_test_sb_md_interleaved 00:22:38.093 ************************************ 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false false 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:38.093 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=398881 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 398881 /var/tmp/spdk-raid.sock 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 398881 ']' 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- 
# local max_retries=100 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:38.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:38.094 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:38.094 [2024-07-24 23:44:23.092556] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:22:38.094 [2024-07-24 23:44:23.092595] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid398881 ] 00:22:38.094 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:38.094 Zero copy mechanism will not be used. 
00:22:38.353 [2024-07-24 23:44:23.156155] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:38.353 [2024-07-24 23:44:23.234416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:38.353 [2024-07-24 23:44:23.296312] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.353 [2024-07-24 23:44:23.296339] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.918 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:38.918 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:22:38.918 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:38.918 23:44:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:22:39.177 BaseBdev1_malloc 00:22:39.177 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:39.435 [2024-07-24 23:44:24.200541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:39.435 [2024-07-24 23:44:24.200575] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.435 [2024-07-24 23:44:24.200588] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ce160 00:22:39.435 [2024-07-24 23:44:24.200611] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.435 [2024-07-24 23:44:24.201661] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.435 [2024-07-24 23:44:24.201681] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:39.435 BaseBdev1 00:22:39.435 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:39.435 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:22:39.435 BaseBdev2_malloc 00:22:39.435 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:39.692 [2024-07-24 23:44:24.565245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:39.692 [2024-07-24 23:44:24.565281] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.692 [2024-07-24 23:44:24.565293] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c5750 00:22:39.692 [2024-07-24 23:44:24.565299] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.692 [2024-07-24 23:44:24.566508] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.692 [2024-07-24 23:44:24.566530] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:39.692 BaseBdev2 00:22:39.692 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:22:39.950 spare_malloc 00:22:39.950 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:22:39.950 spare_delay 00:22:39.950 23:44:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:40.208 [2024-07-24 23:44:25.070397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:40.208 [2024-07-24 23:44:25.070432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:40.208 [2024-07-24 23:44:25.070451] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c84f0 00:22:40.208 [2024-07-24 23:44:25.070457] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:40.208 [2024-07-24 23:44:25.071406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:40.208 [2024-07-24 23:44:25.071427] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:40.208 spare 00:22:40.208 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:40.467 [2024-07-24 23:44:25.238850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:40.467 [2024-07-24 23:44:25.239720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:40.467 [2024-07-24 23:44:25.239857] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13ca7f0 00:22:40.467 [2024-07-24 23:44:25.239866] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:40.467 [2024-07-24 23:44:25.239917] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1230e40 00:22:40.467 [2024-07-24 23:44:25.239976] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x13ca7f0 00:22:40.467 [2024-07-24 23:44:25.239981] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13ca7f0 00:22:40.467 [2024-07-24 23:44:25.240017] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:22:40.467 "name": "raid_bdev1", 00:22:40.467 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:40.467 "strip_size_kb": 0, 00:22:40.467 "state": "online", 00:22:40.467 "raid_level": "raid1", 00:22:40.467 "superblock": true, 00:22:40.467 "num_base_bdevs": 2, 00:22:40.467 "num_base_bdevs_discovered": 2, 00:22:40.467 "num_base_bdevs_operational": 2, 00:22:40.467 "base_bdevs_list": [ 00:22:40.467 { 00:22:40.467 "name": "BaseBdev1", 00:22:40.467 "uuid": "d86aa584-8e36-5cfe-9a57-42ac38f8a92f", 00:22:40.467 "is_configured": true, 00:22:40.467 "data_offset": 256, 00:22:40.467 "data_size": 7936 00:22:40.467 }, 00:22:40.467 { 00:22:40.467 "name": "BaseBdev2", 00:22:40.467 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:40.467 "is_configured": true, 00:22:40.467 "data_offset": 256, 00:22:40.467 "data_size": 7936 00:22:40.467 } 00:22:40.467 ] 00:22:40.467 }' 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.467 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:41.034 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:41.034 23:44:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:41.292 [2024-07-24 23:44:26.069155] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:22:41.292 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:41.552 [2024-07-24 23:44:26.413866] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.552 
23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.552 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.811 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.811 "name": "raid_bdev1", 00:22:41.811 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:41.811 "strip_size_kb": 0, 00:22:41.811 "state": "online", 00:22:41.811 "raid_level": "raid1", 00:22:41.811 "superblock": true, 00:22:41.811 "num_base_bdevs": 2, 00:22:41.811 "num_base_bdevs_discovered": 1, 00:22:41.811 "num_base_bdevs_operational": 1, 00:22:41.811 "base_bdevs_list": [ 00:22:41.811 { 00:22:41.811 "name": null, 00:22:41.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.811 "is_configured": false, 00:22:41.811 "data_offset": 256, 00:22:41.811 "data_size": 7936 00:22:41.811 }, 00:22:41.811 { 00:22:41.811 "name": "BaseBdev2", 00:22:41.811 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:41.811 "is_configured": true, 00:22:41.811 "data_offset": 256, 00:22:41.811 "data_size": 7936 00:22:41.811 } 00:22:41.811 ] 00:22:41.811 }' 00:22:41.811 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.811 23:44:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:42.377 23:44:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:42.378 [2024-07-24 23:44:27.264123] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:42.378 [2024-07-24 23:44:27.267207] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x13ca700 00:22:42.378 [2024-07-24 23:44:27.268499] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:42.378 23:44:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.313 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:43.571 "name": "raid_bdev1", 00:22:43.571 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:43.571 "strip_size_kb": 0, 00:22:43.571 "state": "online", 00:22:43.571 "raid_level": "raid1", 00:22:43.571 "superblock": true, 00:22:43.571 "num_base_bdevs": 2, 00:22:43.571 "num_base_bdevs_discovered": 2, 00:22:43.571 "num_base_bdevs_operational": 2, 00:22:43.571 "process": { 00:22:43.571 "type": "rebuild", 00:22:43.571 "target": "spare", 00:22:43.571 "progress": { 00:22:43.571 "blocks": 2816, 00:22:43.571 "percent": 35 00:22:43.571 } 00:22:43.571 }, 00:22:43.571 "base_bdevs_list": [ 00:22:43.571 { 
00:22:43.571 "name": "spare", 00:22:43.571 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:43.571 "is_configured": true, 00:22:43.571 "data_offset": 256, 00:22:43.571 "data_size": 7936 00:22:43.571 }, 00:22:43.571 { 00:22:43.571 "name": "BaseBdev2", 00:22:43.571 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:43.571 "is_configured": true, 00:22:43.571 "data_offset": 256, 00:22:43.571 "data_size": 7936 00:22:43.571 } 00:22:43.571 ] 00:22:43.571 }' 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:43.571 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:43.830 [2024-07-24 23:44:28.704782] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.830 [2024-07-24 23:44:28.779015] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:43.830 [2024-07-24 23:44:28.779049] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.830 [2024-07-24 23:44:28.779058] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:43.830 [2024-07-24 23:44:28.779078] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:43.830 23:44:28 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.830 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.089 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.089 "name": "raid_bdev1", 00:22:44.089 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:44.089 "strip_size_kb": 0, 00:22:44.089 "state": "online", 00:22:44.089 "raid_level": "raid1", 00:22:44.089 "superblock": true, 00:22:44.089 "num_base_bdevs": 2, 00:22:44.089 "num_base_bdevs_discovered": 1, 00:22:44.089 "num_base_bdevs_operational": 1, 00:22:44.089 "base_bdevs_list": [ 00:22:44.089 { 00:22:44.089 "name": null, 00:22:44.089 
"uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.089 "is_configured": false, 00:22:44.089 "data_offset": 256, 00:22:44.089 "data_size": 7936 00:22:44.089 }, 00:22:44.089 { 00:22:44.089 "name": "BaseBdev2", 00:22:44.089 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:44.089 "is_configured": true, 00:22:44.089 "data_offset": 256, 00:22:44.089 "data_size": 7936 00:22:44.089 } 00:22:44.089 ] 00:22:44.089 }' 00:22:44.089 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.089 23:44:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:44.656 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.657 "name": "raid_bdev1", 00:22:44.657 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:44.657 "strip_size_kb": 0, 00:22:44.657 "state": "online", 00:22:44.657 "raid_level": "raid1", 00:22:44.657 "superblock": true, 00:22:44.657 
"num_base_bdevs": 2, 00:22:44.657 "num_base_bdevs_discovered": 1, 00:22:44.657 "num_base_bdevs_operational": 1, 00:22:44.657 "base_bdevs_list": [ 00:22:44.657 { 00:22:44.657 "name": null, 00:22:44.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.657 "is_configured": false, 00:22:44.657 "data_offset": 256, 00:22:44.657 "data_size": 7936 00:22:44.657 }, 00:22:44.657 { 00:22:44.657 "name": "BaseBdev2", 00:22:44.657 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:44.657 "is_configured": true, 00:22:44.657 "data_offset": 256, 00:22:44.657 "data_size": 7936 00:22:44.657 } 00:22:44.657 ] 00:22:44.657 }' 00:22:44.657 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:44.916 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:44.916 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:44.916 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:44.916 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:44.916 [2024-07-24 23:44:29.877287] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:44.916 [2024-07-24 23:44:29.880365] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13cd270 00:22:44.916 [2024-07-24 23:44:29.881346] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:44.916 23:44:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.292 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.293 23:44:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.293 "name": "raid_bdev1", 00:22:46.293 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:46.293 "strip_size_kb": 0, 00:22:46.293 "state": "online", 00:22:46.293 "raid_level": "raid1", 00:22:46.293 "superblock": true, 00:22:46.293 "num_base_bdevs": 2, 00:22:46.293 "num_base_bdevs_discovered": 2, 00:22:46.293 "num_base_bdevs_operational": 2, 00:22:46.293 "process": { 00:22:46.293 "type": "rebuild", 00:22:46.293 "target": "spare", 00:22:46.293 "progress": { 00:22:46.293 "blocks": 2816, 00:22:46.293 "percent": 35 00:22:46.293 } 00:22:46.293 }, 00:22:46.293 "base_bdevs_list": [ 00:22:46.293 { 00:22:46.293 "name": "spare", 00:22:46.293 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:46.293 "is_configured": true, 00:22:46.293 "data_offset": 256, 00:22:46.293 "data_size": 7936 00:22:46.293 }, 00:22:46.293 { 00:22:46.293 "name": "BaseBdev2", 00:22:46.293 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:46.293 "is_configured": true, 00:22:46.293 "data_offset": 256, 00:22:46.293 "data_size": 7936 00:22:46.293 
} 00:22:46.293 ] 00:22:46.293 }' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:46.293 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=864 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.293 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.551 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.551 "name": "raid_bdev1", 00:22:46.552 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:46.552 "strip_size_kb": 0, 00:22:46.552 "state": "online", 00:22:46.552 "raid_level": "raid1", 00:22:46.552 "superblock": true, 00:22:46.552 "num_base_bdevs": 2, 00:22:46.552 "num_base_bdevs_discovered": 2, 00:22:46.552 "num_base_bdevs_operational": 2, 00:22:46.552 "process": { 00:22:46.552 "type": "rebuild", 00:22:46.552 "target": "spare", 00:22:46.552 "progress": { 00:22:46.552 "blocks": 3584, 00:22:46.552 "percent": 45 00:22:46.552 } 00:22:46.552 }, 00:22:46.552 "base_bdevs_list": [ 00:22:46.552 { 00:22:46.552 "name": "spare", 00:22:46.552 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:46.552 "is_configured": true, 00:22:46.552 "data_offset": 256, 00:22:46.552 "data_size": 7936 00:22:46.552 }, 00:22:46.552 { 00:22:46.552 "name": "BaseBdev2", 00:22:46.552 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:46.552 "is_configured": true, 00:22:46.552 "data_offset": 256, 00:22:46.552 "data_size": 7936 00:22:46.552 } 00:22:46.552 ] 00:22:46.552 }' 00:22:46.552 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:46.552 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:46.552 23:44:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:46.552 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:46.552 23:44:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.488 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:47.747 "name": "raid_bdev1", 00:22:47.747 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:47.747 "strip_size_kb": 0, 00:22:47.747 "state": "online", 00:22:47.747 "raid_level": "raid1", 00:22:47.747 "superblock": true, 00:22:47.747 "num_base_bdevs": 2, 00:22:47.747 "num_base_bdevs_discovered": 2, 00:22:47.747 "num_base_bdevs_operational": 2, 00:22:47.747 "process": { 00:22:47.747 "type": "rebuild", 00:22:47.747 
"target": "spare", 00:22:47.747 "progress": { 00:22:47.747 "blocks": 6656, 00:22:47.747 "percent": 83 00:22:47.747 } 00:22:47.747 }, 00:22:47.747 "base_bdevs_list": [ 00:22:47.747 { 00:22:47.747 "name": "spare", 00:22:47.747 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:47.747 "is_configured": true, 00:22:47.747 "data_offset": 256, 00:22:47.747 "data_size": 7936 00:22:47.747 }, 00:22:47.747 { 00:22:47.747 "name": "BaseBdev2", 00:22:47.747 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:47.747 "is_configured": true, 00:22:47.747 "data_offset": 256, 00:22:47.747 "data_size": 7936 00:22:47.747 } 00:22:47.747 ] 00:22:47.747 }' 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:47.747 23:44:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:48.005 [2024-07-24 23:44:33.003044] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:48.005 [2024-07-24 23:44:33.003085] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:48.005 [2024-07-24 23:44:33.003147] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:48.944 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.945 "name": "raid_bdev1", 00:22:48.945 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:48.945 "strip_size_kb": 0, 00:22:48.945 "state": "online", 00:22:48.945 "raid_level": "raid1", 00:22:48.945 "superblock": true, 00:22:48.945 "num_base_bdevs": 2, 00:22:48.945 "num_base_bdevs_discovered": 2, 00:22:48.945 "num_base_bdevs_operational": 2, 00:22:48.945 "base_bdevs_list": [ 00:22:48.945 { 00:22:48.945 "name": "spare", 00:22:48.945 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:48.945 "is_configured": true, 00:22:48.945 "data_offset": 256, 00:22:48.945 "data_size": 7936 00:22:48.945 }, 00:22:48.945 { 00:22:48.945 "name": "BaseBdev2", 00:22:48.945 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:48.945 "is_configured": true, 00:22:48.945 "data_offset": 256, 00:22:48.945 "data_size": 7936 00:22:48.945 } 00:22:48.945 ] 00:22:48.945 }' 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.945 23:44:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.203 "name": "raid_bdev1", 00:22:49.203 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:49.203 "strip_size_kb": 0, 00:22:49.203 "state": "online", 00:22:49.203 "raid_level": "raid1", 00:22:49.203 "superblock": true, 00:22:49.203 "num_base_bdevs": 2, 00:22:49.203 "num_base_bdevs_discovered": 2, 00:22:49.203 "num_base_bdevs_operational": 2, 00:22:49.203 "base_bdevs_list": [ 00:22:49.203 { 00:22:49.203 "name": "spare", 00:22:49.203 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:49.203 
"is_configured": true, 00:22:49.203 "data_offset": 256, 00:22:49.203 "data_size": 7936 00:22:49.203 }, 00:22:49.203 { 00:22:49.203 "name": "BaseBdev2", 00:22:49.203 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:49.203 "is_configured": true, 00:22:49.203 "data_offset": 256, 00:22:49.203 "data_size": 7936 00:22:49.203 } 00:22:49.203 ] 00:22:49.203 }' 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.203 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.462 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:49.462 "name": "raid_bdev1", 00:22:49.462 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:49.462 "strip_size_kb": 0, 00:22:49.462 "state": "online", 00:22:49.462 "raid_level": "raid1", 00:22:49.462 "superblock": true, 00:22:49.462 "num_base_bdevs": 2, 00:22:49.462 "num_base_bdevs_discovered": 2, 00:22:49.462 "num_base_bdevs_operational": 2, 00:22:49.462 "base_bdevs_list": [ 00:22:49.462 { 00:22:49.462 "name": "spare", 00:22:49.462 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:49.462 "is_configured": true, 00:22:49.462 "data_offset": 256, 00:22:49.462 "data_size": 7936 00:22:49.462 }, 00:22:49.462 { 00:22:49.462 "name": "BaseBdev2", 00:22:49.462 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:49.462 "is_configured": true, 00:22:49.462 "data_offset": 256, 00:22:49.462 "data_size": 7936 00:22:49.462 } 00:22:49.462 ] 00:22:49.462 }' 00:22:49.462 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:49.462 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:50.029 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:50.029 [2024-07-24 23:44:34.976318] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:22:50.029 [2024-07-24 23:44:34.976339] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:50.029 [2024-07-24 23:44:34.976381] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.029 [2024-07-24 23:44:34.976419] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.029 [2024-07-24 23:44:34.976425] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ca7f0 name raid_bdev1, state offline 00:22:50.029 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.029 23:44:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:22:50.286 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:50.286 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:22:50.286 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:50.286 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:50.544 [2024-07-24 23:44:35.473578] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:50.544 [2024-07-24 23:44:35.473605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.544 [2024-07-24 23:44:35.473615] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13ca100 00:22:50.544 [2024-07-24 23:44:35.473621] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.544 [2024-07-24 23:44:35.474675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.544 [2024-07-24 23:44:35.474693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:50.544 [2024-07-24 23:44:35.474728] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:50.544 [2024-07-24 23:44:35.474745] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:50.544 [2024-07-24 23:44:35.474801] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:50.544 spare 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:50.544 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:50.544 23:44:35 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:50.545 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.545 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.803 [2024-07-24 23:44:35.575087] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x13cae30 00:22:50.803 [2024-07-24 23:44:35.575096] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:22:50.803 [2024-07-24 23:44:35.575136] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13bf980 00:22:50.803 [2024-07-24 23:44:35.575191] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13cae30 00:22:50.803 [2024-07-24 23:44:35.575196] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13cae30 00:22:50.803 [2024-07-24 23:44:35.575235] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:50.803 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.803 "name": "raid_bdev1", 00:22:50.803 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:50.803 "strip_size_kb": 0, 00:22:50.803 "state": "online", 00:22:50.803 "raid_level": "raid1", 00:22:50.803 "superblock": true, 00:22:50.803 "num_base_bdevs": 2, 00:22:50.803 "num_base_bdevs_discovered": 2, 00:22:50.803 "num_base_bdevs_operational": 2, 00:22:50.803 "base_bdevs_list": [ 00:22:50.803 { 00:22:50.803 "name": "spare", 00:22:50.803 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:50.803 "is_configured": true, 00:22:50.803 "data_offset": 256, 00:22:50.803 "data_size": 7936 00:22:50.803 }, 00:22:50.803 { 00:22:50.803 "name": "BaseBdev2", 00:22:50.803 "uuid": 
"f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:50.803 "is_configured": true, 00:22:50.803 "data_offset": 256, 00:22:50.803 "data_size": 7936 00:22:50.803 } 00:22:50.803 ] 00:22:50.803 }' 00:22:50.803 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.803 23:44:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:51.370 "name": "raid_bdev1", 00:22:51.370 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:51.370 "strip_size_kb": 0, 00:22:51.370 "state": "online", 00:22:51.370 "raid_level": "raid1", 00:22:51.370 "superblock": true, 00:22:51.370 "num_base_bdevs": 2, 00:22:51.370 "num_base_bdevs_discovered": 2, 00:22:51.370 "num_base_bdevs_operational": 2, 00:22:51.370 "base_bdevs_list": [ 00:22:51.370 { 00:22:51.370 "name": "spare", 00:22:51.370 "uuid": 
"2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:51.370 "is_configured": true, 00:22:51.370 "data_offset": 256, 00:22:51.370 "data_size": 7936 00:22:51.370 }, 00:22:51.370 { 00:22:51.370 "name": "BaseBdev2", 00:22:51.370 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:51.370 "is_configured": true, 00:22:51.370 "data_offset": 256, 00:22:51.370 "data_size": 7936 00:22:51.370 } 00:22:51.370 ] 00:22:51.370 }' 00:22:51.370 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:51.628 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:51.887 [2024-07-24 23:44:36.736912] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.887 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.145 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.145 "name": "raid_bdev1", 00:22:52.145 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:52.145 "strip_size_kb": 0, 00:22:52.145 "state": "online", 00:22:52.145 "raid_level": "raid1", 00:22:52.145 "superblock": true, 00:22:52.145 "num_base_bdevs": 2, 00:22:52.145 "num_base_bdevs_discovered": 1, 00:22:52.145 "num_base_bdevs_operational": 1, 00:22:52.145 "base_bdevs_list": [ 00:22:52.145 { 00:22:52.145 "name": null, 00:22:52.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.145 "is_configured": false, 00:22:52.145 "data_offset": 
256, 00:22:52.145 "data_size": 7936 00:22:52.145 }, 00:22:52.145 { 00:22:52.145 "name": "BaseBdev2", 00:22:52.145 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:52.145 "is_configured": true, 00:22:52.145 "data_offset": 256, 00:22:52.145 "data_size": 7936 00:22:52.145 } 00:22:52.145 ] 00:22:52.145 }' 00:22:52.145 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.145 23:44:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:52.711 23:44:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:52.711 [2024-07-24 23:44:37.571084] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.711 [2024-07-24 23:44:37.571195] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:52.711 [2024-07-24 23:44:37.571205] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:22:52.711 [2024-07-24 23:44:37.571224] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.711 [2024-07-24 23:44:37.574273] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1231470 00:22:52.711 [2024-07-24 23:44:37.575291] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:52.711 23:44:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.646 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:53.904 "name": "raid_bdev1", 00:22:53.904 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:53.904 "strip_size_kb": 0, 00:22:53.904 "state": "online", 00:22:53.904 "raid_level": "raid1", 00:22:53.904 "superblock": true, 00:22:53.904 "num_base_bdevs": 2, 00:22:53.904 "num_base_bdevs_discovered": 2, 00:22:53.904 "num_base_bdevs_operational": 2, 00:22:53.904 "process": { 00:22:53.904 "type": 
"rebuild", 00:22:53.904 "target": "spare", 00:22:53.904 "progress": { 00:22:53.904 "blocks": 2816, 00:22:53.904 "percent": 35 00:22:53.904 } 00:22:53.904 }, 00:22:53.904 "base_bdevs_list": [ 00:22:53.904 { 00:22:53.904 "name": "spare", 00:22:53.904 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:53.904 "is_configured": true, 00:22:53.904 "data_offset": 256, 00:22:53.904 "data_size": 7936 00:22:53.904 }, 00:22:53.904 { 00:22:53.904 "name": "BaseBdev2", 00:22:53.904 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:53.904 "is_configured": true, 00:22:53.904 "data_offset": 256, 00:22:53.904 "data_size": 7936 00:22:53.904 } 00:22:53.904 ] 00:22:53.904 }' 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:53.904 23:44:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:54.163 [2024-07-24 23:44:38.983854] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.163 [2024-07-24 23:44:38.985079] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:54.163 [2024-07-24 23:44:38.985109] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.163 [2024-07-24 23:44:38.985118] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.163 [2024-07-24 23:44:38.985122] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.163 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.421 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.421 "name": "raid_bdev1", 00:22:54.421 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:54.421 "strip_size_kb": 0, 00:22:54.421 "state": "online", 00:22:54.421 "raid_level": "raid1", 00:22:54.421 "superblock": true, 00:22:54.421 
"num_base_bdevs": 2, 00:22:54.421 "num_base_bdevs_discovered": 1, 00:22:54.421 "num_base_bdevs_operational": 1, 00:22:54.421 "base_bdevs_list": [ 00:22:54.421 { 00:22:54.421 "name": null, 00:22:54.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.421 "is_configured": false, 00:22:54.421 "data_offset": 256, 00:22:54.421 "data_size": 7936 00:22:54.421 }, 00:22:54.421 { 00:22:54.421 "name": "BaseBdev2", 00:22:54.421 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:54.421 "is_configured": true, 00:22:54.421 "data_offset": 256, 00:22:54.421 "data_size": 7936 00:22:54.421 } 00:22:54.421 ] 00:22:54.421 }' 00:22:54.421 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.421 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:54.679 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:54.942 [2024-07-24 23:44:39.798787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:54.942 [2024-07-24 23:44:39.798825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.942 [2024-07-24 23:44:39.798839] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c8ac0 00:22:54.942 [2024-07-24 23:44:39.798845] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.942 [2024-07-24 23:44:39.798984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.942 [2024-07-24 23:44:39.798993] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:54.942 [2024-07-24 23:44:39.799030] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:54.942 [2024-07-24 23:44:39.799037] 
bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:54.942 [2024-07-24 23:44:39.799042] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:54.942 [2024-07-24 23:44:39.799052] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:54.942 [2024-07-24 23:44:39.802081] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13ce990 00:22:54.942 [2024-07-24 23:44:39.803044] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:54.942 spare 00:22:54.942 23:44:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.919 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.178 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.178 "name": "raid_bdev1", 00:22:56.178 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 
00:22:56.178 "strip_size_kb": 0, 00:22:56.178 "state": "online", 00:22:56.178 "raid_level": "raid1", 00:22:56.178 "superblock": true, 00:22:56.178 "num_base_bdevs": 2, 00:22:56.178 "num_base_bdevs_discovered": 2, 00:22:56.178 "num_base_bdevs_operational": 2, 00:22:56.178 "process": { 00:22:56.178 "type": "rebuild", 00:22:56.178 "target": "spare", 00:22:56.178 "progress": { 00:22:56.178 "blocks": 2816, 00:22:56.178 "percent": 35 00:22:56.179 } 00:22:56.179 }, 00:22:56.179 "base_bdevs_list": [ 00:22:56.179 { 00:22:56.179 "name": "spare", 00:22:56.179 "uuid": "2b1b6af3-8e4b-5a13-a4fd-ecefd8b99920", 00:22:56.179 "is_configured": true, 00:22:56.179 "data_offset": 256, 00:22:56.179 "data_size": 7936 00:22:56.179 }, 00:22:56.179 { 00:22:56.179 "name": "BaseBdev2", 00:22:56.179 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:56.179 "is_configured": true, 00:22:56.179 "data_offset": 256, 00:22:56.179 "data_size": 7936 00:22:56.179 } 00:22:56.179 ] 00:22:56.179 }' 00:22:56.179 23:44:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.179 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:56.179 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:56.179 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:56.179 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:56.437 [2024-07-24 23:44:41.239348] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.437 [2024-07-24 23:44:41.313575] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:56.437 [2024-07-24 
23:44:41.313608] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.437 [2024-07-24 23:44:41.313617] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.437 [2024-07-24 23:44:41.313621] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:56.437 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.438 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.696 23:44:41 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.696 "name": "raid_bdev1", 00:22:56.696 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:56.696 "strip_size_kb": 0, 00:22:56.696 "state": "online", 00:22:56.696 "raid_level": "raid1", 00:22:56.696 "superblock": true, 00:22:56.696 "num_base_bdevs": 2, 00:22:56.696 "num_base_bdevs_discovered": 1, 00:22:56.696 "num_base_bdevs_operational": 1, 00:22:56.696 "base_bdevs_list": [ 00:22:56.696 { 00:22:56.696 "name": null, 00:22:56.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.696 "is_configured": false, 00:22:56.696 "data_offset": 256, 00:22:56.696 "data_size": 7936 00:22:56.696 }, 00:22:56.696 { 00:22:56.696 "name": "BaseBdev2", 00:22:56.696 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:56.696 "is_configured": true, 00:22:56.696 "data_offset": 256, 00:22:56.696 "data_size": 7936 00:22:56.696 } 00:22:56.696 ] 00:22:56.696 }' 00:22:56.696 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.696 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:57.262 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.262 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.263 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:57.263 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:57.263 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.263 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.263 23:44:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.263 "name": "raid_bdev1", 00:22:57.263 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:57.263 "strip_size_kb": 0, 00:22:57.263 "state": "online", 00:22:57.263 "raid_level": "raid1", 00:22:57.263 "superblock": true, 00:22:57.263 "num_base_bdevs": 2, 00:22:57.263 "num_base_bdevs_discovered": 1, 00:22:57.263 "num_base_bdevs_operational": 1, 00:22:57.263 "base_bdevs_list": [ 00:22:57.263 { 00:22:57.263 "name": null, 00:22:57.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.263 "is_configured": false, 00:22:57.263 "data_offset": 256, 00:22:57.263 "data_size": 7936 00:22:57.263 }, 00:22:57.263 { 00:22:57.263 "name": "BaseBdev2", 00:22:57.263 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:57.263 "is_configured": true, 00:22:57.263 "data_offset": 256, 00:22:57.263 "data_size": 7936 00:22:57.263 } 00:22:57.263 ] 00:22:57.263 }' 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:57.263 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:57.521 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:57.521 [2024-07-24 23:44:42.520171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:57.521 [2024-07-24 23:44:42.520205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.521 [2024-07-24 23:44:42.520218] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13cc620 00:22:57.521 [2024-07-24 23:44:42.520224] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.521 [2024-07-24 23:44:42.520341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.521 [2024-07-24 23:44:42.520350] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:57.521 [2024-07-24 23:44:42.520382] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:57.521 [2024-07-24 23:44:42.520389] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:57.521 [2024-07-24 23:44:42.520394] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:57.779 BaseBdev1 00:22:57.779 23:44:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.712 "name": "raid_bdev1", 00:22:58.712 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:58.712 "strip_size_kb": 0, 00:22:58.712 "state": "online", 00:22:58.712 "raid_level": "raid1", 00:22:58.712 "superblock": true, 00:22:58.712 "num_base_bdevs": 2, 00:22:58.712 "num_base_bdevs_discovered": 1, 00:22:58.712 "num_base_bdevs_operational": 1, 00:22:58.712 "base_bdevs_list": [ 00:22:58.712 { 00:22:58.712 "name": null, 00:22:58.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.712 "is_configured": false, 00:22:58.712 "data_offset": 256, 00:22:58.712 "data_size": 7936 00:22:58.712 }, 00:22:58.712 { 00:22:58.712 "name": "BaseBdev2", 00:22:58.712 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:58.712 "is_configured": true, 00:22:58.712 "data_offset": 256, 00:22:58.712 
"data_size": 7936 00:22:58.712 } 00:22:58.712 ] 00:22:58.712 }' 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.712 23:44:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.278 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.536 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.536 "name": "raid_bdev1", 00:22:59.536 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:22:59.536 "strip_size_kb": 0, 00:22:59.536 "state": "online", 00:22:59.536 "raid_level": "raid1", 00:22:59.536 "superblock": true, 00:22:59.537 "num_base_bdevs": 2, 00:22:59.537 "num_base_bdevs_discovered": 1, 00:22:59.537 "num_base_bdevs_operational": 1, 00:22:59.537 "base_bdevs_list": [ 00:22:59.537 { 00:22:59.537 "name": null, 00:22:59.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.537 "is_configured": false, 00:22:59.537 "data_offset": 256, 00:22:59.537 "data_size": 7936 00:22:59.537 }, 
00:22:59.537 { 00:22:59.537 "name": "BaseBdev2", 00:22:59.537 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:22:59.537 "is_configured": true, 00:22:59.537 "data_offset": 256, 00:22:59.537 "data_size": 7936 00:22:59.537 } 00:22:59.537 ] 00:22:59.537 }' 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 
00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:59.537 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:59.795 [2024-07-24 23:44:44.581520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:59.795 [2024-07-24 23:44:44.581611] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:59.795 [2024-07-24 23:44:44.581620] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:59.795 request: 00:22:59.795 { 00:22:59.795 "base_bdev": "BaseBdev1", 00:22:59.795 "raid_bdev": "raid_bdev1", 00:22:59.795 "method": "bdev_raid_add_base_bdev", 00:22:59.795 "req_id": 1 00:22:59.795 } 00:22:59.795 Got JSON-RPC error response 00:22:59.795 response: 00:22:59.795 { 00:22:59.795 "code": -22, 00:22:59.795 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:59.795 } 00:22:59.795 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:22:59.795 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 
00:22:59.795 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:22:59.795 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:22:59.795 23:44:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:00.729 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:00.729 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.729 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:00.729 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.730 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.988 23:44:45 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.988 "name": "raid_bdev1", 00:23:00.988 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:23:00.988 "strip_size_kb": 0, 00:23:00.988 "state": "online", 00:23:00.988 "raid_level": "raid1", 00:23:00.988 "superblock": true, 00:23:00.988 "num_base_bdevs": 2, 00:23:00.988 "num_base_bdevs_discovered": 1, 00:23:00.988 "num_base_bdevs_operational": 1, 00:23:00.988 "base_bdevs_list": [ 00:23:00.988 { 00:23:00.988 "name": null, 00:23:00.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.988 "is_configured": false, 00:23:00.988 "data_offset": 256, 00:23:00.988 "data_size": 7936 00:23:00.988 }, 00:23:00.988 { 00:23:00.988 "name": "BaseBdev2", 00:23:00.988 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:23:00.988 "is_configured": true, 00:23:00.988 "data_offset": 256, 00:23:00.988 "data_size": 7936 00:23:00.988 } 00:23:00.988 ] 00:23:00.988 }' 00:23:00.988 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.988 23:44:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.554 "name": "raid_bdev1", 00:23:01.554 "uuid": "a5835d28-8795-429b-b7e1-bbd18f37e9e5", 00:23:01.554 "strip_size_kb": 0, 00:23:01.554 "state": "online", 00:23:01.554 "raid_level": "raid1", 00:23:01.554 "superblock": true, 00:23:01.554 "num_base_bdevs": 2, 00:23:01.554 "num_base_bdevs_discovered": 1, 00:23:01.554 "num_base_bdevs_operational": 1, 00:23:01.554 "base_bdevs_list": [ 00:23:01.554 { 00:23:01.554 "name": null, 00:23:01.554 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.554 "is_configured": false, 00:23:01.554 "data_offset": 256, 00:23:01.554 "data_size": 7936 00:23:01.554 }, 00:23:01.554 { 00:23:01.554 "name": "BaseBdev2", 00:23:01.554 "uuid": "f8be1e73-c679-5acb-832f-28a8e57a03c7", 00:23:01.554 "is_configured": true, 00:23:01.554 "data_offset": 256, 00:23:01.554 "data_size": 7936 00:23:01.554 } 00:23:01.554 ] 00:23:01.554 }' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 398881 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 398881 ']' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@954 -- # kill -0 398881 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 398881 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 398881' 00:23:01.554 killing process with pid 398881 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 398881 00:23:01.554 Received shutdown signal, test time was about 53.577187 seconds 00:23:01.554 00:23:01.554 Latency(us) 00:23:01.554 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:01.554 =================================================================================================================== 00:23:01.554 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:01.554 [2024-07-24 23:44:46.553343] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:01.554 [2024-07-24 23:44:46.553407] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:01.554 [2024-07-24 23:44:46.553437] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:01.554 [2024-07-24 23:44:46.553443] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13cae30 name raid_bdev1, state offline 00:23:01.554 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@974 -- # wait 398881 00:23:01.812 [2024-07-24 23:44:46.577970] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:01.812 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:23:01.812 00:23:01.812 real 0m23.720s 00:23:01.812 user 0m36.857s 00:23:01.812 sys 0m2.581s 00:23:01.812 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:01.812 23:44:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:01.812 ************************************ 00:23:01.812 END TEST raid_rebuild_test_sb_md_interleaved 00:23:01.812 ************************************ 00:23:01.812 23:44:46 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:23:01.812 23:44:46 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:23:01.812 23:44:46 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 398881 ']' 00:23:01.812 23:44:46 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 398881 00:23:02.070 23:44:46 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:23:02.070 00:23:02.070 real 14m9.491s 00:23:02.070 user 23m59.561s 00:23:02.070 sys 2m8.085s 00:23:02.070 23:44:46 bdev_raid -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:02.070 23:44:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:02.070 ************************************ 00:23:02.070 END TEST bdev_raid 00:23:02.070 ************************************ 00:23:02.070 23:44:46 -- spdk/autotest.sh@195 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:02.070 23:44:46 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:23:02.070 23:44:46 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:02.070 23:44:46 -- common/autotest_common.sh@10 -- # set +x 00:23:02.070 ************************************ 00:23:02.070 START TEST bdevperf_config 00:23:02.070 
************************************ 00:23:02.070 23:44:46 bdevperf_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:23:02.070 * Looking for test storage... 00:23:02.070 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:02.070 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:23:02.070 23:44:46 bdevperf_config -- 
bdevperf/common.sh@8 -- # local job_section=job0 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:02.070 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:02.070 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:02.070 23:44:46 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:23:02.070 23:44:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:02.070 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:23:02.071 
23:44:47 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:02.071 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:02.071 23:44:47 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:04.599 23:44:49 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-24 23:44:47.060896] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:04.600 [2024-07-24 23:44:47.060957] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403282 ] 00:23:04.600 Using job config with 4 jobs 00:23:04.600 [2024-07-24 23:44:47.136979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.600 [2024-07-24 23:44:47.224973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:04.600 cpumask for '\''job0'\'' is too big 00:23:04.600 cpumask for '\''job1'\'' is too big 00:23:04.600 cpumask for '\''job2'\'' is too big 00:23:04.600 cpumask for '\''job3'\'' is too big 00:23:04.600 Running I/O for 2 seconds... 
00:23:04.600 00:23:04.600 Latency(us) 00:23:04.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37352.29 36.48 0.00 0.00 6849.46 1279.51 10360.93 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37363.65 36.49 0.00 0.00 6837.12 1178.09 9175.04 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37342.43 36.47 0.00 0.00 6831.11 1170.29 8051.57 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37321.15 36.45 0.00 0.00 6825.20 1170.29 7645.87 00:23:04.600 =================================================================================================================== 00:23:04.600 Total : 149379.53 145.88 0.00 0.00 6835.71 1170.29 10360.93' 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-24 23:44:47.060896] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:04.600 [2024-07-24 23:44:47.060957] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403282 ] 00:23:04.600 Using job config with 4 jobs 00:23:04.600 [2024-07-24 23:44:47.136979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.600 [2024-07-24 23:44:47.224973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:04.600 cpumask for '\''job0'\'' is too big 00:23:04.600 cpumask for '\''job1'\'' is too big 00:23:04.600 cpumask for '\''job2'\'' is too big 00:23:04.600 cpumask for '\''job3'\'' is too big 00:23:04.600 Running I/O for 2 seconds... 
00:23:04.600 00:23:04.600 Latency(us) 00:23:04.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37352.29 36.48 0.00 0.00 6849.46 1279.51 10360.93 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37363.65 36.49 0.00 0.00 6837.12 1178.09 9175.04 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37342.43 36.47 0.00 0.00 6831.11 1170.29 8051.57 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37321.15 36.45 0.00 0.00 6825.20 1170.29 7645.87 00:23:04.600 =================================================================================================================== 00:23:04.600 Total : 149379.53 145.88 0.00 0.00 6835.71 1170.29 10360.93' 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 23:44:47.060896] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:04.600 [2024-07-24 23:44:47.060957] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403282 ] 00:23:04.600 Using job config with 4 jobs 00:23:04.600 [2024-07-24 23:44:47.136979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.600 [2024-07-24 23:44:47.224973] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:04.600 cpumask for '\''job0'\'' is too big 00:23:04.600 cpumask for '\''job1'\'' is too big 00:23:04.600 cpumask for '\''job2'\'' is too big 00:23:04.600 cpumask for '\''job3'\'' is too big 00:23:04.600 Running I/O for 2 seconds... 
00:23:04.600 00:23:04.600 Latency(us) 00:23:04.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37352.29 36.48 0.00 0.00 6849.46 1279.51 10360.93 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.01 37363.65 36.49 0.00 0.00 6837.12 1178.09 9175.04 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37342.43 36.47 0.00 0.00 6831.11 1170.29 8051.57 00:23:04.600 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:04.600 Malloc0 : 2.02 37321.15 36.45 0.00 0.00 6825.20 1170.29 7645.87 00:23:04.600 =================================================================================================================== 00:23:04.600 Total : 149379.53 145.88 0.00 0.00 6835.71 1170.29 10360.93' 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:23:04.600 23:44:49 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:04.858 [2024-07-24 23:44:49.627036] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:04.858 [2024-07-24 23:44:49.627079] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid403664 ] 00:23:04.858 [2024-07-24 23:44:49.707668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.858 [2024-07-24 23:44:49.801099] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:05.116 cpumask for 'job0' is too big 00:23:05.116 cpumask for 'job1' is too big 00:23:05.116 cpumask for 'job2' is too big 00:23:05.116 cpumask for 'job3' is too big 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:23:07.645 Running I/O for 2 seconds... 00:23:07.645 00:23:07.645 Latency(us) 00:23:07.645 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:07.645 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:07.645 Malloc0 : 2.01 36730.77 35.87 0.00 0.00 6964.75 1287.31 11297.16 00:23:07.645 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:07.645 Malloc0 : 2.02 36709.59 35.85 0.00 0.00 6957.73 1295.12 9986.44 00:23:07.645 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:07.645 Malloc0 : 2.02 36688.60 35.83 0.00 0.00 6950.80 1271.71 8987.79 00:23:07.645 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:23:07.645 Malloc0 : 2.02 36667.33 35.81 0.00 0.00 6944.02 1263.91 8987.79 00:23:07.645 =================================================================================================================== 00:23:07.645 Total : 146796.30 143.36 0.00 0.00 6954.33 1263.91 11297.16' 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:07.645 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:07.645 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:07.645 23:44:52 bdevperf_config -- 
bdevperf/common.sh@18 -- # job='[job2]' 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:07.645 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:07.645 23:44:52 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:10.176 23:44:54 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-24 23:44:52.206268] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:10.176 [2024-07-24 23:44:52.206309] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404121 ] 00:23:10.176 Using job config with 3 jobs 00:23:10.176 [2024-07-24 23:44:52.278244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:10.176 [2024-07-24 23:44:52.361134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:10.176 cpumask for '\''job0'\'' is too big 00:23:10.176 cpumask for '\''job1'\'' is too big 00:23:10.176 cpumask for '\''job2'\'' is too big 00:23:10.176 Running I/O for 2 seconds... 
00:23:10.176 00:23:10.176 Latency(us) 00:23:10.176 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:10.176 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.176 Malloc0 : 2.01 50083.13 48.91 0.00 0.00 5104.78 1248.30 7989.15 00:23:10.176 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.176 Malloc0 : 2.01 50054.30 48.88 0.00 0.00 5099.96 1240.50 6740.85 00:23:10.176 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.176 Malloc0 : 2.01 50025.73 48.85 0.00 0.00 5094.99 1232.70 5867.03 00:23:10.176 =================================================================================================================== 00:23:10.176 Total : 150163.17 146.64 0.00 0.00 5099.91 1232.70 7989.15' 00:23:10.176 23:44:54 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-24 23:44:52.206268] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:10.176 [2024-07-24 23:44:52.206309] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404121 ] 00:23:10.176 Using job config with 3 jobs 00:23:10.176 [2024-07-24 23:44:52.278244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:10.176 [2024-07-24 23:44:52.361134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:10.176 cpumask for '\''job0'\'' is too big 00:23:10.176 cpumask for '\''job1'\'' is too big 00:23:10.176 cpumask for '\''job2'\'' is too big 00:23:10.176 Running I/O for 2 seconds... 
00:23:10.176 00:23:10.176 Latency(us) 00:23:10.176 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:10.176 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.176 Malloc0 : 2.01 50083.13 48.91 0.00 0.00 5104.78 1248.30 7989.15 00:23:10.176 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.176 Malloc0 : 2.01 50054.30 48.88 0.00 0.00 5099.96 1240.50 6740.85 00:23:10.177 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.177 Malloc0 : 2.01 50025.73 48.85 0.00 0.00 5094.99 1232.70 5867.03 00:23:10.177 =================================================================================================================== 00:23:10.177 Total : 150163.17 146.64 0.00 0.00 5099.91 1232.70 7989.15' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 23:44:52.206268] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:10.177 [2024-07-24 23:44:52.206309] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404121 ] 00:23:10.177 Using job config with 3 jobs 00:23:10.177 [2024-07-24 23:44:52.278244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:10.177 [2024-07-24 23:44:52.361134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:10.177 cpumask for '\''job0'\'' is too big 00:23:10.177 cpumask for '\''job1'\'' is too big 00:23:10.177 cpumask for '\''job2'\'' is too big 00:23:10.177 Running I/O for 2 seconds... 
00:23:10.177 00:23:10.177 Latency(us) 00:23:10.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:10.177 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.177 Malloc0 : 2.01 50083.13 48.91 0.00 0.00 5104.78 1248.30 7989.15 00:23:10.177 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.177 Malloc0 : 2.01 50054.30 48.88 0.00 0.00 5099.96 1240.50 6740.85 00:23:10.177 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:23:10.177 Malloc0 : 2.01 50025.73 48.85 0.00 0.00 5094.99 1232.70 5867.03 00:23:10.177 =================================================================================================================== 00:23:10.177 Total : 150163.17 146.64 0.00 0.00 5099.91 1232.70 7989.15' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:23:10.177 23:44:54 
bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:10.177 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:10.177 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:10.177 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 
00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:10.177 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:23:10.177 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:23:10.177 23:44:54 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:12.712 23:44:57 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-24 23:44:54.786157] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:12.712 [2024-07-24 23:44:54.786201] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404575 ] 00:23:12.712 Using job config with 4 jobs 00:23:12.712 [2024-07-24 23:44:54.863896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:12.712 [2024-07-24 23:44:54.950147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:12.712 cpumask for '\''job0'\'' is too big 00:23:12.712 cpumask for '\''job1'\'' is too big 00:23:12.712 cpumask for '\''job2'\'' is too big 00:23:12.712 cpumask for '\''job3'\'' is too big 00:23:12.712 Running I/O for 2 seconds... 00:23:12.712 00:23:12.712 Latency(us) 00:23:12.712 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.02 18654.00 18.22 0.00 0.00 13717.71 2496.61 21096.35 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.02 18642.91 18.21 0.00 0.00 13717.33 2949.12 21096.35 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.02 18632.26 18.20 0.00 0.00 13691.72 2402.99 18849.40 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18667.63 18.23 0.00 0.00 13656.19 2917.91 18974.23 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.03 18656.83 18.22 0.00 0.00 13635.34 2465.40 16477.62 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18645.96 18.21 0.00 0.00 13634.32 3011.54 16602.45 00:23:12.712 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.03 18634.90 18.20 0.00 0.00 13611.22 2449.80 14605.17 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18623.68 18.19 0.00 0.00 13610.78 3011.54 14730.00 00:23:12.712 =================================================================================================================== 00:23:12.712 Total : 149158.18 145.66 0.00 0.00 13659.20 2402.99 21096.35' 00:23:12.712 23:44:57 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-24 23:44:54.786157] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:12.712 [2024-07-24 23:44:54.786201] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404575 ] 00:23:12.712 Using job config with 4 jobs 00:23:12.712 [2024-07-24 23:44:54.863896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:12.712 [2024-07-24 23:44:54.950147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:12.712 cpumask for '\''job0'\'' is too big 00:23:12.712 cpumask for '\''job1'\'' is too big 00:23:12.712 cpumask for '\''job2'\'' is too big 00:23:12.712 cpumask for '\''job3'\'' is too big 00:23:12.712 Running I/O for 2 seconds... 
00:23:12.712 00:23:12.712 Latency(us) 00:23:12.712 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.02 18654.00 18.22 0.00 0.00 13717.71 2496.61 21096.35 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.02 18642.91 18.21 0.00 0.00 13717.33 2949.12 21096.35 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.02 18632.26 18.20 0.00 0.00 13691.72 2402.99 18849.40 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18667.63 18.23 0.00 0.00 13656.19 2917.91 18974.23 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.03 18656.83 18.22 0.00 0.00 13635.34 2465.40 16477.62 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18645.96 18.21 0.00 0.00 13634.32 3011.54 16602.45 00:23:12.712 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc0 : 2.03 18634.90 18.20 0.00 0.00 13611.22 2449.80 14605.17 00:23:12.712 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.712 Malloc1 : 2.03 18623.68 18.19 0.00 0.00 13610.78 3011.54 14730.00 00:23:12.712 =================================================================================================================== 00:23:12.712 Total : 149158.18 145.66 0.00 0.00 13659.20 2402.99 21096.35' 00:23:12.712 23:44:57 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-24 23:44:54.786157] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:12.712 [2024-07-24 23:44:54.786201] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid404575 ] 00:23:12.712 Using job config with 4 jobs 00:23:12.712 [2024-07-24 23:44:54.863896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:12.712 [2024-07-24 23:44:54.950147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:12.712 cpumask for '\''job0'\'' is too big 00:23:12.712 cpumask for '\''job1'\'' is too big 00:23:12.712 cpumask for '\''job2'\'' is too big 00:23:12.712 cpumask for '\''job3'\'' is too big 00:23:12.713 Running I/O for 2 seconds... 00:23:12.713 00:23:12.713 Latency(us) 00:23:12.713 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:12.713 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc0 : 2.02 18654.00 18.22 0.00 0.00 13717.71 2496.61 21096.35 00:23:12.713 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc1 : 2.02 18642.91 18.21 0.00 0.00 13717.33 2949.12 21096.35 00:23:12.713 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc0 : 2.02 18632.26 18.20 0.00 0.00 13691.72 2402.99 18849.40 00:23:12.713 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc1 : 2.03 18667.63 18.23 0.00 0.00 13656.19 2917.91 18974.23 00:23:12.713 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc0 : 2.03 18656.83 18.22 0.00 0.00 13635.34 2465.40 16477.62 00:23:12.713 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc1 : 2.03 18645.96 18.21 0.00 0.00 13634.32 3011.54 16602.45 00:23:12.713 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc0 : 2.03 18634.90 18.20 0.00 0.00 13611.22 2449.80 14605.17 00:23:12.713 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:23:12.713 Malloc1 : 2.03 18623.68 18.19 0.00 0.00 13610.78 3011.54 14730.00 00:23:12.713 =================================================================================================================== 00:23:12.713 Total : 149158.18 145.66 0.00 0.00 13659.20 2402.99 21096.35' 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:23:12.713 23:44:57 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:23:12.713 00:23:12.713 real 0m10.434s 00:23:12.713 user 0m9.457s 00:23:12.713 sys 0m0.826s 00:23:12.713 23:44:57 bdevperf_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:12.713 23:44:57 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:23:12.713 ************************************ 00:23:12.713 END TEST bdevperf_config 00:23:12.713 ************************************ 00:23:12.713 23:44:57 -- spdk/autotest.sh@196 -- # uname -s 00:23:12.713 23:44:57 -- spdk/autotest.sh@196 -- # [[ Linux == Linux ]] 00:23:12.713 23:44:57 -- spdk/autotest.sh@197 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:12.713 23:44:57 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:23:12.713 23:44:57 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:23:12.713 23:44:57 -- common/autotest_common.sh@10 -- # set +x 00:23:12.713 ************************************ 00:23:12.713 START TEST reactor_set_interrupt 00:23:12.713 ************************************ 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:12.713 * Looking for test storage... 00:23:12.713 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:12.713 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:12.713 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:12.713 23:44:57 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:12.713 23:44:57 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:12.713 23:44:57 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:12.714 23:44:57 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:12.714 23:44:57 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:12.714 23:44:57 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:12.714 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:12.714 #define SPDK_CONFIG_H 00:23:12.714 #define SPDK_CONFIG_APPS 1 00:23:12.714 #define SPDK_CONFIG_ARCH native 00:23:12.714 #undef SPDK_CONFIG_ASAN 00:23:12.714 #undef SPDK_CONFIG_AVAHI 00:23:12.714 #undef SPDK_CONFIG_CET 00:23:12.714 #define SPDK_CONFIG_COVERAGE 1 00:23:12.714 #define SPDK_CONFIG_CROSS_PREFIX 
00:23:12.714 #define SPDK_CONFIG_CRYPTO 1 00:23:12.714 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:12.714 #undef SPDK_CONFIG_CUSTOMOCF 00:23:12.714 #undef SPDK_CONFIG_DAOS 00:23:12.714 #define SPDK_CONFIG_DAOS_DIR 00:23:12.714 #define SPDK_CONFIG_DEBUG 1 00:23:12.714 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:12.714 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:12.714 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:12.714 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:12.714 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:12.714 #undef SPDK_CONFIG_DPDK_UADK 00:23:12.714 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:12.714 #define SPDK_CONFIG_EXAMPLES 1 00:23:12.714 #undef SPDK_CONFIG_FC 00:23:12.714 #define SPDK_CONFIG_FC_PATH 00:23:12.714 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:12.714 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:12.714 #undef SPDK_CONFIG_FUSE 00:23:12.714 #undef SPDK_CONFIG_FUZZER 00:23:12.714 #define SPDK_CONFIG_FUZZER_LIB 00:23:12.714 #undef SPDK_CONFIG_GOLANG 00:23:12.714 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:12.714 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:23:12.714 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:12.714 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:12.714 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:12.714 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:12.714 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:12.714 #define SPDK_CONFIG_IDXD 1 00:23:12.714 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:12.714 #define SPDK_CONFIG_IPSEC_MB 1 00:23:12.714 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:12.714 #define SPDK_CONFIG_ISAL 1 00:23:12.714 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:12.714 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:12.714 #define SPDK_CONFIG_LIBDIR 00:23:12.714 #undef SPDK_CONFIG_LTO 00:23:12.714 #define SPDK_CONFIG_MAX_LCORES 128 00:23:12.714 #define SPDK_CONFIG_NVME_CUSE 1 00:23:12.714 #undef 
SPDK_CONFIG_OCF 00:23:12.714 #define SPDK_CONFIG_OCF_PATH 00:23:12.714 #define SPDK_CONFIG_OPENSSL_PATH 00:23:12.714 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:12.714 #define SPDK_CONFIG_PGO_DIR 00:23:12.714 #undef SPDK_CONFIG_PGO_USE 00:23:12.714 #define SPDK_CONFIG_PREFIX /usr/local 00:23:12.714 #undef SPDK_CONFIG_RAID5F 00:23:12.714 #undef SPDK_CONFIG_RBD 00:23:12.714 #define SPDK_CONFIG_RDMA 1 00:23:12.714 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:12.714 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:12.714 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:12.714 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:12.714 #define SPDK_CONFIG_SHARED 1 00:23:12.714 #undef SPDK_CONFIG_SMA 00:23:12.714 #define SPDK_CONFIG_TESTS 1 00:23:12.714 #undef SPDK_CONFIG_TSAN 00:23:12.714 #define SPDK_CONFIG_UBLK 1 00:23:12.714 #define SPDK_CONFIG_UBSAN 1 00:23:12.714 #undef SPDK_CONFIG_UNIT_TESTS 00:23:12.714 #undef SPDK_CONFIG_URING 00:23:12.714 #define SPDK_CONFIG_URING_PATH 00:23:12.714 #undef SPDK_CONFIG_URING_ZNS 00:23:12.714 #undef SPDK_CONFIG_USDT 00:23:12.714 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:12.714 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:12.714 #undef SPDK_CONFIG_VFIO_USER 00:23:12.714 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:12.714 #define SPDK_CONFIG_VHOST 1 00:23:12.714 #define SPDK_CONFIG_VIRTIO 1 00:23:12.714 #undef SPDK_CONFIG_VTUNE 00:23:12.714 #define SPDK_CONFIG_VTUNE_DIR 00:23:12.714 #define SPDK_CONFIG_WERROR 1 00:23:12.714 #define SPDK_CONFIG_WPDK_DIR 00:23:12.714 #undef SPDK_CONFIG_XNVME 00:23:12.714 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:12.714 23:44:57 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:12.714 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:12.714 23:44:57 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:23:12.714 23:44:57 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:12.714 23:44:57 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:12.714 23:44:57 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:12.715 23:44:57 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:12.715 23:44:57 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:12.715 23:44:57 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:23:12.715 23:44:57 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:12.715 23:44:57 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:12.715 23:44:57 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:12.715 23:44:57 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:12.715 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 1 00:23:12.716 
23:44:57 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@173 -- # : 0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@179 -- # 
VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@187 -- # export 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@202 -- # cat 00:23:12.716 
23:44:57 reactor_set_interrupt -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:12.716 23:44:57 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@265 -- # export valgrind= 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@265 -- # valgrind= 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@271 -- # uname -s 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@281 -- # MAKE=make 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j96 00:23:12.716 23:44:57 reactor_set_interrupt -- 
common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@301 -- # TEST_MODE= 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@320 -- # [[ -z 405071 ]] 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@320 -- # kill -0 405071 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local mount target_dir 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:23:12.716 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.RVqWzE 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@357 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RVqWzE/tests/interrupt /tmp/spdk.RVqWzE 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@329 -- # df -T 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=895512576 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4388917248 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:23:12.717 23:44:57 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=89523290112 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=95562764288 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=6039474176 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=47725748224 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=47781380096 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=55631872 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=19102932992 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=19112554496 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=9621504 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=47780855808 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=47781384192 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=528384 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=9556271104 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=9556275200 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:23:12.717 * Looking for test storage... 
00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@370 -- # local target_space new_size 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@374 -- # mount=/ 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@376 -- # target_space=89523290112 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@383 -- # new_size=8254066688 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 
00:23:12.717 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@391 -- # return 0 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:12.717 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:23:12.717 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:12.717 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:12.718 23:44:57 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=405112 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 405112 /var/tmp/spdk.sock 00:23:12.718 23:44:57 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 405112 ']' 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:12.718 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:12.718 23:44:57 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:12.718 [2024-07-24 23:44:57.665517] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:12.718 [2024-07-24 23:44:57.665560] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid405112 ] 00:23:12.977 [2024-07-24 23:44:57.728245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:12.977 [2024-07-24 23:44:57.807579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:12.977 [2024-07-24 23:44:57.807676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:12.977 [2024-07-24 23:44:57.807676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:12.977 [2024-07-24 23:44:57.869704] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:13.543 23:44:58 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:13.543 23:44:58 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:23:13.543 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:23:13.543 23:44:58 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:13.802 Malloc0 00:23:13.802 Malloc1 00:23:13.802 Malloc2 00:23:13.802 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:23:13.802 23:44:58 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:13.802 23:44:58 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:13.802 23:44:58 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:13.802 5000+0 records in 00:23:13.802 5000+0 records out 00:23:13.802 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0156342 s, 655 MB/s 
00:23:13.802 23:44:58 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:14.060 AIO0 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 405112 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 405112 without_thd 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=405112 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:14.060 23:44:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:14.061 23:44:58 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:14.319 23:44:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:14.320 23:44:59 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:14.320 spdk_thread ids are 1 on reactor0. 
00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405112 0 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405112 0 idle 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:14.320 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405112 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.26 reactor_0' 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405112 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.26 reactor_0 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = 
\b\u\s\y ]] 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405112 1 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405112 1 idle 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:14.579 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405115 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.00 reactor_1' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405115 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.00 reactor_1 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 
-- # awk '{print $9}' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405112 2 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405112 2 idle 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405116 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.00 reactor_2' 00:23:14.857 23:44:59 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405116 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.00 reactor_2 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:23:14.857 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:23:15.115 [2024-07-24 23:44:59.928345] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:15.115 23:44:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:15.115 [2024-07-24 23:45:00.104106] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 
00:23:15.115 [2024-07-24 23:45:00.104418] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:15.375 [2024-07-24 23:45:00.276031] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:23:15.375 [2024-07-24 23:45:00.276129] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 405112 0 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 405112 0 busy 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:15.375 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405112 root 20 0 128.2g 35328 23040 R 99.9 0.0 0:00.62 reactor_0' 00:23:15.667 23:45:00 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 405112 root 20 0 128.2g 35328 23040 R 99.9 0.0 0:00.62 reactor_0 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 405112 2 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 405112 2 busy 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:15.667 
23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405116 root 20 0 128.2g 35328 23040 R 99.9 0.0 0:00.35 reactor_2' 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405116 root 20 0 128.2g 35328 23040 R 99.9 0.0 0:00.35 reactor_2 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:15.667 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:15.926 [2024-07-24 23:45:00.796030] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:23:15.926 [2024-07-24 23:45:00.796134] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 405112 2 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405112 2 idle 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:15.926 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:16.184 23:45:00 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405116 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.51 reactor_2' 00:23:16.184 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405116 root 20 0 128.2g 35328 23040 S 0.0 0.0 0:00.51 reactor_2 00:23:16.184 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:16.184 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:16.184 23:45:00 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:16.185 23:45:00 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:16.185 23:45:00 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:16.185 23:45:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:16.185 23:45:00 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:16.185 23:45:00 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:16.185 23:45:00 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:16.185 [2024-07-24 23:45:01.156027] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:16.185 [2024-07-24 23:45:01.156144] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:16.185 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:23:16.185 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:23:16.185 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:23:16.443 [2024-07-24 23:45:01.328424] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 405112 0 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405112 0 idle 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405112 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405112 -w 256 00:23:16.443 23:45:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405112 root 20 0 128.2g 35328 23040 S 6.7 0.0 0:01.33 reactor_0' 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405112 root 20 0 128.2g 35328 23040 S 6.7 0.0 0:01.33 reactor_0 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 
00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:23:16.702 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 405112 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 405112 ']' 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 405112 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 405112 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 405112' 00:23:16.702 killing process with pid 405112 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 405112 00:23:16.702 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 405112 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:16.961 23:45:01 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=405976 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:16.961 23:45:01 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 405976 /var/tmp/spdk.sock 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 405976 ']' 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:16.961 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:16.961 23:45:01 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:16.961 [2024-07-24 23:45:01.794023] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:16.961 [2024-07-24 23:45:01.794067] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid405976 ] 00:23:16.961 [2024-07-24 23:45:01.856845] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:16.961 [2024-07-24 23:45:01.935897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:16.961 [2024-07-24 23:45:01.935992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:16.961 [2024-07-24 23:45:01.935993] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:17.220 [2024-07-24 23:45:01.998412] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:23:17.787 23:45:02 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:17.787 23:45:02 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:23:17.787 23:45:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:23:17.787 23:45:02 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:18.045 Malloc0 00:23:18.045 Malloc1 00:23:18.045 Malloc2 00:23:18.045 23:45:02 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:23:18.045 23:45:02 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:23:18.045 23:45:02 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:18.045 23:45:02 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:18.045 5000+0 records in 00:23:18.045 5000+0 records out 00:23:18.045 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0177662 s, 576 MB/s 
00:23:18.045 23:45:02 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:18.045 AIO0 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 405976 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 405976 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=405976 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:18.045 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:23:18.303 23:45:03 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:23:18.303 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:23:18.561 spdk_thread ids are 1 on reactor0. 
00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405976 0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405976 0 idle 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405976 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0' 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405976 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.27 reactor_0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = 
\b\u\s\y ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405976 1 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405976 1 idle 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:23:18.561 23:45:03 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:18.562 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:18.562 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:18.562 23:45:03 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:18.562 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405983 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405983 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 
-- # awk '{print $9}' 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 405976 2 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405976 2 idle 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:18.820 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405984 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 00:23:19.078 23:45:03 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405984 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:23:19.078 23:45:03 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:23:19.078 [2024-07-24 23:45:04.056406] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:23:19.078 [2024-07-24 23:45:04.056516] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:23:19.078 [2024-07-24 23:45:04.056665] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:19.078 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:23:19.337 [2024-07-24 23:45:04.224742] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:23:19.337 [2024-07-24 23:45:04.224928] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 405976 0 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 405976 0 busy 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:19.337 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405976 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.62 reactor_0' 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405976 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.62 reactor_0 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:19.595 23:45:04 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 405976 2 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 405976 2 busy 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405984 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2' 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405984 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:19.595 23:45:04 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:19.595 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:23:19.854 [2024-07-24 23:45:04.742179] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:23:19.854 [2024-07-24 23:45:04.742258] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 405976 2 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405976 2 idle 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:19.854 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405984 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2' 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405984 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.51 reactor_2 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:20.112 23:45:04 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:23:20.112 [2024-07-24 23:45:05.087052] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:23:20.112 [2024-07-24 23:45:05.087173] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:23:20.112 [2024-07-24 23:45:05.087186] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 405976 0 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 405976 0 idle 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=405976 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:23:20.112 23:45:05 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 405976 -w 256 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 405976 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0' 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 405976 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.30 reactor_0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:23:20.371 23:45:05 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:23:20.371 23:45:05 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 405976 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 405976 ']' 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 405976 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 405976 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 405976' 00:23:20.371 killing process with pid 405976 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 405976 00:23:20.371 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 405976 00:23:20.630 23:45:05 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:23:20.630 23:45:05 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:20.630 00:23:20.630 real 0m8.128s 00:23:20.630 user 0m7.349s 00:23:20.630 sys 0m1.411s 00:23:20.630 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:20.630 23:45:05 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:23:20.630 ************************************ 00:23:20.630 END TEST reactor_set_interrupt 00:23:20.630 ************************************ 00:23:20.630 23:45:05 -- spdk/autotest.sh@198 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:20.630 23:45:05 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:23:20.630 23:45:05 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:20.630 23:45:05 -- common/autotest_common.sh@10 -- # set +x 00:23:20.630 ************************************ 00:23:20.630 START TEST reap_unregistered_poller 00:23:20.630 ************************************ 00:23:20.630 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:20.891 * Looking for test storage... 
00:23:20.891 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:20.891 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:23:20.891 23:45:05 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:23:20.891 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:23:20.891 23:45:05 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:23:20.891 23:45:05 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:23:20.892 
23:45:05 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:23:20.892 23:45:05 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:23:20.892 23:45:05 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:23:20.892 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:23:20.892 23:45:05 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:23:20.892 #define SPDK_CONFIG_H 00:23:20.892 #define SPDK_CONFIG_APPS 1 00:23:20.892 #define SPDK_CONFIG_ARCH native 00:23:20.892 #undef SPDK_CONFIG_ASAN 00:23:20.892 #undef SPDK_CONFIG_AVAHI 00:23:20.892 #undef SPDK_CONFIG_CET 00:23:20.892 #define SPDK_CONFIG_COVERAGE 1 00:23:20.892 #define SPDK_CONFIG_CROSS_PREFIX 00:23:20.892 #define SPDK_CONFIG_CRYPTO 1 00:23:20.892 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:23:20.892 #undef SPDK_CONFIG_CUSTOMOCF 00:23:20.892 #undef SPDK_CONFIG_DAOS 00:23:20.892 #define SPDK_CONFIG_DAOS_DIR 00:23:20.892 #define SPDK_CONFIG_DEBUG 1 00:23:20.892 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:23:20.892 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:23:20.892 #define SPDK_CONFIG_DPDK_INC_DIR 00:23:20.892 #define SPDK_CONFIG_DPDK_LIB_DIR 00:23:20.892 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:23:20.892 #undef SPDK_CONFIG_DPDK_UADK 00:23:20.892 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:23:20.892 #define SPDK_CONFIG_EXAMPLES 1 00:23:20.892 #undef SPDK_CONFIG_FC 00:23:20.892 #define SPDK_CONFIG_FC_PATH 00:23:20.892 #define SPDK_CONFIG_FIO_PLUGIN 1 00:23:20.892 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:23:20.892 #undef SPDK_CONFIG_FUSE 00:23:20.892 #undef SPDK_CONFIG_FUZZER 00:23:20.892 #define SPDK_CONFIG_FUZZER_LIB 00:23:20.892 #undef SPDK_CONFIG_GOLANG 00:23:20.892 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:23:20.892 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:23:20.892 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:23:20.892 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:23:20.892 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:23:20.892 #undef SPDK_CONFIG_HAVE_LIBBSD 00:23:20.892 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:23:20.892 #define SPDK_CONFIG_IDXD 1 00:23:20.892 #define SPDK_CONFIG_IDXD_KERNEL 1 00:23:20.892 #define SPDK_CONFIG_IPSEC_MB 1 00:23:20.892 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:23:20.892 #define SPDK_CONFIG_ISAL 1 00:23:20.892 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:23:20.892 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:23:20.892 #define SPDK_CONFIG_LIBDIR 00:23:20.892 #undef SPDK_CONFIG_LTO 00:23:20.892 #define SPDK_CONFIG_MAX_LCORES 128 00:23:20.892 #define SPDK_CONFIG_NVME_CUSE 1 00:23:20.892 #undef SPDK_CONFIG_OCF 00:23:20.892 #define SPDK_CONFIG_OCF_PATH 00:23:20.892 #define SPDK_CONFIG_OPENSSL_PATH 00:23:20.892 #undef SPDK_CONFIG_PGO_CAPTURE 00:23:20.892 #define SPDK_CONFIG_PGO_DIR 00:23:20.892 #undef SPDK_CONFIG_PGO_USE 00:23:20.892 #define SPDK_CONFIG_PREFIX /usr/local 00:23:20.892 #undef SPDK_CONFIG_RAID5F 00:23:20.892 #undef SPDK_CONFIG_RBD 00:23:20.892 #define SPDK_CONFIG_RDMA 1 00:23:20.892 #define SPDK_CONFIG_RDMA_PROV verbs 00:23:20.892 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:23:20.892 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:23:20.892 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:23:20.892 #define 
SPDK_CONFIG_SHARED 1 00:23:20.892 #undef SPDK_CONFIG_SMA 00:23:20.892 #define SPDK_CONFIG_TESTS 1 00:23:20.892 #undef SPDK_CONFIG_TSAN 00:23:20.892 #define SPDK_CONFIG_UBLK 1 00:23:20.892 #define SPDK_CONFIG_UBSAN 1 00:23:20.892 #undef SPDK_CONFIG_UNIT_TESTS 00:23:20.892 #undef SPDK_CONFIG_URING 00:23:20.892 #define SPDK_CONFIG_URING_PATH 00:23:20.892 #undef SPDK_CONFIG_URING_ZNS 00:23:20.892 #undef SPDK_CONFIG_USDT 00:23:20.892 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:23:20.892 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:23:20.892 #undef SPDK_CONFIG_VFIO_USER 00:23:20.893 #define SPDK_CONFIG_VFIO_USER_DIR 00:23:20.893 #define SPDK_CONFIG_VHOST 1 00:23:20.893 #define SPDK_CONFIG_VIRTIO 1 00:23:20.893 #undef SPDK_CONFIG_VTUNE 00:23:20.893 #define SPDK_CONFIG_VTUNE_DIR 00:23:20.893 #define SPDK_CONFIG_WERROR 1 00:23:20.893 #define SPDK_CONFIG_WPDK_DIR 00:23:20.893 #undef SPDK_CONFIG_XNVME 00:23:20.893 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:20.893 23:45:05 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:20.893 23:45:05 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:20.893 23:45:05 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:20.893 23:45:05 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:20.893 23:45:05 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:23:20.893 23:45:05 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:23:20.893 23:45:05 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:23:20.893 23:45:05 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:23:20.893 23:45:05 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:23:20.893 23:45:05 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:23:20.893 23:45:05 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:23:20.893 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:23:20.894 23:45:05 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:23:20.894 23:45:05 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 1 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:23:20.894 23:45:05 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@173 -- # : 0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:20.894 23:45:05 
reap_unregistered_poller -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@202 -- # cat 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:20.894 23:45:05 reap_unregistered_poller -- 
common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export 
AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@265 -- # export valgrind= 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@265 -- # valgrind= 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@271 -- # uname -s 00:23:20.894 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@281 -- # MAKE=make 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j96 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@298 -- # HUGEMEM=4096 
00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@301 -- # TEST_MODE= 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@320 -- # [[ -z 406775 ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@320 -- # kill -0 406775 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local mount target_dir 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.jjyeWU 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.jjyeWU/tests/interrupt 
/tmp/spdk.jjyeWU 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@329 -- # df -T 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=895512576 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4388917248 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
fss["$mount"]=overlay 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=89523118080 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=95562764288 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=6039646208 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=47725748224 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=47781380096 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=55631872 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=19102932992 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=19112554496 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=9621504 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=47780855808 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=47781384192 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=528384 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=9556271104 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=9556275200 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:23:20.895 * Looking for test storage... 
00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@370 -- # local target_space new_size 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@374 -- # mount=/ 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@376 -- # target_space=89523118080 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@383 -- # new_size=8254238720 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.895 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@391 -- # return 0 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:23:20.895 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:23:20.895 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:23:20.895 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:20.895 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:23:20.895 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=406816 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:23:20.896 23:45:05 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 406816 /var/tmp/spdk.sock 00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@831 -- # '[' -z 406816 ']' 00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:20.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:20.896 23:45:05 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:20.896 [2024-07-24 23:45:05.849016] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:20.896 [2024-07-24 23:45:05.849059] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid406816 ] 00:23:21.155 [2024-07-24 23:45:05.912648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:21.155 [2024-07-24 23:45:05.991847] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:21.155 [2024-07-24 23:45:05.991940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:21.155 [2024-07-24 23:45:05.991940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:21.155 [2024-07-24 23:45:06.053977] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:23:21.722 23:45:06 reap_unregistered_poller -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:21.722 23:45:06 reap_unregistered_poller -- common/autotest_common.sh@864 -- # return 0 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:23:21.722 23:45:06 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:23:21.722 23:45:06 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:21.722 23:45:06 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:23:21.722 "name": "app_thread", 00:23:21.722 "id": 1, 00:23:21.722 "active_pollers": [], 00:23:21.722 "timed_pollers": [ 00:23:21.722 { 00:23:21.722 "name": "rpc_subsystem_poll_servers", 00:23:21.722 "id": 1, 00:23:21.722 "state": "waiting", 00:23:21.722 "run_count": 0, 00:23:21.722 "busy_count": 0, 00:23:21.722 "period_ticks": 8400000 00:23:21.722 } 00:23:21.722 ], 00:23:21.722 "paused_pollers": [] 00:23:21.722 }' 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:23:21.722 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:23:21.980 
23:45:06 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:23:21.980 5000+0 records in 00:23:21.980 5000+0 records out 00:23:21.980 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0162608 s, 630 MB/s 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:23:21.980 AIO0 00:23:21.980 23:45:06 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:22.238 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:23:22.238 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:23:22.238 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:23:22.238 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:23:22.238 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:22.238 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:23:22.497 "name": "app_thread", 00:23:22.497 "id": 1, 00:23:22.497 "active_pollers": [], 00:23:22.497 "timed_pollers": [ 00:23:22.497 { 00:23:22.497 "name": "rpc_subsystem_poll_servers", 00:23:22.497 "id": 1, 00:23:22.497 "state": "waiting", 00:23:22.497 "run_count": 0, 00:23:22.497 "busy_count": 0, 
00:23:22.497 "period_ticks": 8400000 00:23:22.497 } 00:23:22.497 ], 00:23:22.497 "paused_pollers": [] 00:23:22.497 }' 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:23:22.497 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 406816 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@950 -- # '[' -z 406816 ']' 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@954 -- # kill -0 406816 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@955 -- # uname 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 406816 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@968 -- # echo 
'killing process with pid 406816' 00:23:22.497 killing process with pid 406816 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@969 -- # kill 406816 00:23:22.497 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@974 -- # wait 406816 00:23:22.756 23:45:07 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:23:22.756 23:45:07 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:23:22.756 00:23:22.756 real 0m1.924s 00:23:22.756 user 0m1.140s 00:23:22.756 sys 0m0.421s 00:23:22.756 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:22.756 23:45:07 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:23:22.756 ************************************ 00:23:22.756 END TEST reap_unregistered_poller 00:23:22.756 ************************************ 00:23:22.756 23:45:07 -- spdk/autotest.sh@202 -- # uname -s 00:23:22.756 23:45:07 -- spdk/autotest.sh@202 -- # [[ Linux == Linux ]] 00:23:22.756 23:45:07 -- spdk/autotest.sh@203 -- # [[ 1 -eq 1 ]] 00:23:22.756 23:45:07 -- spdk/autotest.sh@209 -- # [[ 1 -eq 0 ]] 00:23:22.756 23:45:07 -- spdk/autotest.sh@215 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@264 -- # timing_exit lib 00:23:22.756 23:45:07 -- common/autotest_common.sh@730 -- # xtrace_disable 00:23:22.756 23:45:07 -- common/autotest_common.sh@10 -- # set +x 00:23:22.756 23:45:07 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@283 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- 
spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@351 -- # '[' 1 -eq 1 ']' 00:23:22.756 23:45:07 -- spdk/autotest.sh@352 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:22.756 23:45:07 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:23:22.756 23:45:07 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:22.756 23:45:07 -- common/autotest_common.sh@10 -- # set +x 00:23:22.756 ************************************ 00:23:22.756 START TEST compress_compdev 00:23:22.756 ************************************ 00:23:22.756 23:45:07 compress_compdev -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:23:22.756 * Looking for test storage... 
00:23:22.756 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:23:22.756 23:45:07 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:23:22.756 23:45:07 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:23:22.756 23:45:07 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:23:22.757 23:45:07 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:23:22.757 23:45:07 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:23:22.757 23:45:07 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:23:22.757 23:45:07 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:22.757 23:45:07 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:22.757 23:45:07 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:22.757 23:45:07 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:23:22.757 23:45:07 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:23:22.757 23:45:07 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=407148 00:23:22.757 23:45:07 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 407148 00:23:22.757 23:45:07 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 407148 ']' 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:22.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:22.757 23:45:07 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:23.016 [2024-07-24 23:45:07.776170] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:23:23.016 [2024-07-24 23:45:07.776218] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid407148 ] 00:23:23.016 [2024-07-24 23:45:07.839708] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:23.016 [2024-07-24 23:45:07.918338] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:23.016 [2024-07-24 23:45:07.918341] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:23.583 [2024-07-24 23:45:08.286694] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:23.583 23:45:08 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:23.583 23:45:08 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:23:23.583 23:45:08 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:23:23.583 23:45:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:23.842 23:45:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:27.128 [2024-07-24 23:45:11.582562] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x197bc00 PMD being used: compress_qat 00:23:27.128 23:45:11 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:27.128 23:45:11 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:27.128 [ 00:23:27.128 { 00:23:27.128 "name": "Nvme0n1", 00:23:27.128 "aliases": [ 00:23:27.128 "59f6a1e0-5c91-459d-8ed7-0a68af7b6b18" 00:23:27.128 ], 00:23:27.128 "product_name": "NVMe disk", 00:23:27.128 "block_size": 512, 00:23:27.128 "num_blocks": 1953525168, 00:23:27.128 "uuid": "59f6a1e0-5c91-459d-8ed7-0a68af7b6b18", 00:23:27.128 "assigned_rate_limits": { 00:23:27.128 "rw_ios_per_sec": 0, 00:23:27.128 "rw_mbytes_per_sec": 0, 00:23:27.129 "r_mbytes_per_sec": 0, 00:23:27.129 "w_mbytes_per_sec": 0 00:23:27.129 }, 00:23:27.129 "claimed": false, 00:23:27.129 "zoned": false, 00:23:27.129 "supported_io_types": { 00:23:27.129 "read": true, 00:23:27.129 "write": true, 00:23:27.129 "unmap": true, 00:23:27.129 "flush": true, 00:23:27.129 "reset": true, 00:23:27.129 "nvme_admin": true, 00:23:27.129 "nvme_io": true, 00:23:27.129 "nvme_io_md": false, 00:23:27.129 "write_zeroes": true, 00:23:27.129 "zcopy": false, 00:23:27.129 "get_zone_info": false, 00:23:27.129 "zone_management": false, 00:23:27.129 "zone_append": false, 00:23:27.129 "compare": false, 00:23:27.129 "compare_and_write": false, 00:23:27.129 "abort": true, 00:23:27.129 "seek_hole": false, 00:23:27.129 "seek_data": false, 00:23:27.129 "copy": false, 00:23:27.129 "nvme_iov_md": false 00:23:27.129 }, 00:23:27.129 "driver_specific": { 00:23:27.129 "nvme": [ 00:23:27.129 { 00:23:27.129 "pci_address": "0000:5e:00.0", 00:23:27.129 "trid": { 00:23:27.129 "trtype": "PCIe", 00:23:27.129 "traddr": "0000:5e:00.0" 00:23:27.129 }, 00:23:27.129 "ctrlr_data": { 00:23:27.129 "cntlid": 0, 00:23:27.129 "vendor_id": "0x8086", 00:23:27.129 "model_number": "INTEL SSDPE2KX010T8", 00:23:27.129 
"serial_number": "BTLJ807001JM1P0FGN", 00:23:27.129 "firmware_revision": "VDV10170", 00:23:27.129 "oacs": { 00:23:27.129 "security": 1, 00:23:27.129 "format": 1, 00:23:27.129 "firmware": 1, 00:23:27.129 "ns_manage": 1 00:23:27.129 }, 00:23:27.129 "multi_ctrlr": false, 00:23:27.129 "ana_reporting": false 00:23:27.129 }, 00:23:27.129 "vs": { 00:23:27.129 "nvme_version": "1.2" 00:23:27.129 }, 00:23:27.129 "ns_data": { 00:23:27.129 "id": 1, 00:23:27.129 "can_share": false 00:23:27.129 }, 00:23:27.129 "security": { 00:23:27.129 "opal": true 00:23:27.129 } 00:23:27.129 } 00:23:27.129 ], 00:23:27.129 "mp_policy": "active_passive" 00:23:27.129 } 00:23:27.129 } 00:23:27.129 ] 00:23:27.129 23:45:11 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:27.129 23:45:11 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:27.129 [2024-07-24 23:45:12.110623] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x197e0a0 PMD being used: compress_qat 00:23:28.064 c170fa8d-ce7e-43e3-ba1f-52f3dc5aaaa1 00:23:28.064 23:45:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:28.322 6a971ab2-4924-43d1-90cc-3def06db8734 00:23:28.322 23:45:13 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:28.322 23:45:13 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:28.584 [ 00:23:28.584 { 00:23:28.584 "name": "6a971ab2-4924-43d1-90cc-3def06db8734", 00:23:28.584 "aliases": [ 00:23:28.584 "lvs0/lv0" 00:23:28.584 ], 00:23:28.584 "product_name": "Logical Volume", 00:23:28.584 "block_size": 512, 00:23:28.584 "num_blocks": 204800, 00:23:28.584 "uuid": "6a971ab2-4924-43d1-90cc-3def06db8734", 00:23:28.584 "assigned_rate_limits": { 00:23:28.584 "rw_ios_per_sec": 0, 00:23:28.584 "rw_mbytes_per_sec": 0, 00:23:28.584 "r_mbytes_per_sec": 0, 00:23:28.584 "w_mbytes_per_sec": 0 00:23:28.584 }, 00:23:28.584 "claimed": false, 00:23:28.584 "zoned": false, 00:23:28.584 "supported_io_types": { 00:23:28.584 "read": true, 00:23:28.584 "write": true, 00:23:28.584 "unmap": true, 00:23:28.584 "flush": false, 00:23:28.584 "reset": true, 00:23:28.584 "nvme_admin": false, 00:23:28.584 "nvme_io": false, 00:23:28.584 "nvme_io_md": false, 00:23:28.584 "write_zeroes": true, 00:23:28.584 "zcopy": false, 00:23:28.584 "get_zone_info": false, 00:23:28.584 "zone_management": false, 00:23:28.584 "zone_append": false, 00:23:28.584 "compare": false, 00:23:28.584 "compare_and_write": false, 00:23:28.584 "abort": false, 00:23:28.584 "seek_hole": true, 00:23:28.584 "seek_data": true, 00:23:28.584 "copy": false, 00:23:28.584 "nvme_iov_md": false 00:23:28.584 }, 00:23:28.584 "driver_specific": { 00:23:28.584 "lvol": { 00:23:28.584 "lvol_store_uuid": "c170fa8d-ce7e-43e3-ba1f-52f3dc5aaaa1", 00:23:28.584 "base_bdev": "Nvme0n1", 00:23:28.584 "thin_provision": true, 00:23:28.584 "num_allocated_clusters": 0, 00:23:28.584 "snapshot": false, 00:23:28.584 "clone": false, 00:23:28.584 "esnap_clone": false 00:23:28.584 } 00:23:28.584 } 00:23:28.584 } 00:23:28.584 ] 00:23:28.584 23:45:13 compress_compdev -- 
common/autotest_common.sh@907 -- # return 0 00:23:28.584 23:45:13 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:23:28.584 23:45:13 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:23:28.843 [2024-07-24 23:45:13.633185] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:28.843 COMP_lvs0/lv0 00:23:28.843 23:45:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:28.843 23:45:13 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:23:29.101 [ 00:23:29.101 { 00:23:29.101 "name": "COMP_lvs0/lv0", 00:23:29.101 "aliases": [ 00:23:29.101 "3e7b7d1d-e5c1-5acd-9189-abbeda9d2be9" 00:23:29.101 ], 00:23:29.101 "product_name": "compress", 00:23:29.101 "block_size": 512, 00:23:29.101 "num_blocks": 200704, 00:23:29.101 "uuid": "3e7b7d1d-e5c1-5acd-9189-abbeda9d2be9", 00:23:29.101 "assigned_rate_limits": { 00:23:29.101 "rw_ios_per_sec": 0, 00:23:29.101 "rw_mbytes_per_sec": 0, 00:23:29.101 "r_mbytes_per_sec": 0, 00:23:29.101 "w_mbytes_per_sec": 0 00:23:29.102 }, 00:23:29.102 "claimed": false, 00:23:29.102 "zoned": false, 
00:23:29.102 "supported_io_types": { 00:23:29.102 "read": true, 00:23:29.102 "write": true, 00:23:29.102 "unmap": false, 00:23:29.102 "flush": false, 00:23:29.102 "reset": false, 00:23:29.102 "nvme_admin": false, 00:23:29.102 "nvme_io": false, 00:23:29.102 "nvme_io_md": false, 00:23:29.102 "write_zeroes": true, 00:23:29.102 "zcopy": false, 00:23:29.102 "get_zone_info": false, 00:23:29.102 "zone_management": false, 00:23:29.102 "zone_append": false, 00:23:29.102 "compare": false, 00:23:29.102 "compare_and_write": false, 00:23:29.102 "abort": false, 00:23:29.102 "seek_hole": false, 00:23:29.102 "seek_data": false, 00:23:29.102 "copy": false, 00:23:29.102 "nvme_iov_md": false 00:23:29.102 }, 00:23:29.102 "driver_specific": { 00:23:29.102 "compress": { 00:23:29.102 "name": "COMP_lvs0/lv0", 00:23:29.102 "base_bdev_name": "6a971ab2-4924-43d1-90cc-3def06db8734", 00:23:29.102 "pm_path": "/tmp/pmem/821f0c2b-08c5-45dc-8060-4c0ef3c5b0d9" 00:23:29.102 } 00:23:29.102 } 00:23:29.102 } 00:23:29.102 ] 00:23:29.102 23:45:13 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:29.102 23:45:13 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:29.102 [2024-07-24 23:45:14.063017] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7fc41b15c0 PMD being used: compress_qat 00:23:29.102 [2024-07-24 23:45:14.064487] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x197aa80 PMD being used: compress_qat 00:23:29.102 Running I/O for 3 seconds... 
00:23:32.384 00:23:32.384 Latency(us) 00:23:32.384 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:32.384 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:23:32.384 Verification LBA range: start 0x0 length 0x3100 00:23:32.384 COMP_lvs0/lv0 : 3.01 4046.93 15.81 0.00 0.00 7866.85 129.71 13856.18 00:23:32.384 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:23:32.384 Verification LBA range: start 0x3100 length 0x3100 00:23:32.384 COMP_lvs0/lv0 : 3.01 4161.47 16.26 0.00 0.00 7650.70 120.44 14417.92 00:23:32.384 =================================================================================================================== 00:23:32.384 Total : 8208.40 32.06 0.00 0.00 7757.26 120.44 14417.92 00:23:32.384 0 00:23:32.384 23:45:17 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:23:32.384 23:45:17 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:23:32.384 23:45:17 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:23:32.643 23:45:17 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:23:32.643 23:45:17 compress_compdev -- compress/compress.sh@78 -- # killprocess 407148 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 407148 ']' 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 407148 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 407148 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 
00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 407148' 00:23:32.643 killing process with pid 407148 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@969 -- # kill 407148 00:23:32.643 Received shutdown signal, test time was about 3.000000 seconds 00:23:32.643 00:23:32.643 Latency(us) 00:23:32.643 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:32.643 =================================================================================================================== 00:23:32.643 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:32.643 23:45:17 compress_compdev -- common/autotest_common.sh@974 -- # wait 407148 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=409367 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:34.017 23:45:18 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 409367 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 409367 ']' 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@838 -- # 
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:34.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:34.017 23:45:18 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:34.017 [2024-07-24 23:45:18.964253] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:34.017 [2024-07-24 23:45:18.964297] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid409367 ] 00:23:34.275 [2024-07-24 23:45:19.026540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:34.275 [2024-07-24 23:45:19.109937] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:34.275 [2024-07-24 23:45:19.109940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:34.533 [2024-07-24 23:45:19.475146] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:34.790 23:45:19 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:34.790 23:45:19 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:23:34.790 23:45:19 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:23:34.790 23:45:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:34.790 23:45:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:38.077 [2024-07-24 23:45:22.758694] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17c0c00 PMD being used: compress_qat 00:23:38.077 23:45:22 compress_compdev -- compress/compress.sh@35 -- 
# waitforbdev Nvme0n1 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:38.077 23:45:22 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:38.336 [ 00:23:38.336 { 00:23:38.336 "name": "Nvme0n1", 00:23:38.336 "aliases": [ 00:23:38.336 "cfef0795-c58b-4bf4-9d8a-df4626993c44" 00:23:38.336 ], 00:23:38.336 "product_name": "NVMe disk", 00:23:38.336 "block_size": 512, 00:23:38.336 "num_blocks": 1953525168, 00:23:38.336 "uuid": "cfef0795-c58b-4bf4-9d8a-df4626993c44", 00:23:38.336 "assigned_rate_limits": { 00:23:38.336 "rw_ios_per_sec": 0, 00:23:38.336 "rw_mbytes_per_sec": 0, 00:23:38.336 "r_mbytes_per_sec": 0, 00:23:38.336 "w_mbytes_per_sec": 0 00:23:38.336 }, 00:23:38.336 "claimed": false, 00:23:38.336 "zoned": false, 00:23:38.336 "supported_io_types": { 00:23:38.336 "read": true, 00:23:38.336 "write": true, 00:23:38.336 "unmap": true, 00:23:38.336 "flush": true, 00:23:38.336 "reset": true, 00:23:38.336 "nvme_admin": true, 00:23:38.336 "nvme_io": true, 00:23:38.336 "nvme_io_md": false, 00:23:38.336 "write_zeroes": true, 00:23:38.336 "zcopy": false, 00:23:38.336 "get_zone_info": false, 00:23:38.336 "zone_management": false, 00:23:38.336 "zone_append": false, 00:23:38.336 "compare": false, 00:23:38.336 "compare_and_write": false, 00:23:38.336 "abort": true, 00:23:38.336 "seek_hole": false, 
00:23:38.336 "seek_data": false, 00:23:38.336 "copy": false, 00:23:38.336 "nvme_iov_md": false 00:23:38.336 }, 00:23:38.336 "driver_specific": { 00:23:38.336 "nvme": [ 00:23:38.336 { 00:23:38.336 "pci_address": "0000:5e:00.0", 00:23:38.336 "trid": { 00:23:38.336 "trtype": "PCIe", 00:23:38.336 "traddr": "0000:5e:00.0" 00:23:38.336 }, 00:23:38.336 "ctrlr_data": { 00:23:38.336 "cntlid": 0, 00:23:38.336 "vendor_id": "0x8086", 00:23:38.336 "model_number": "INTEL SSDPE2KX010T8", 00:23:38.336 "serial_number": "BTLJ807001JM1P0FGN", 00:23:38.336 "firmware_revision": "VDV10170", 00:23:38.336 "oacs": { 00:23:38.336 "security": 1, 00:23:38.336 "format": 1, 00:23:38.336 "firmware": 1, 00:23:38.336 "ns_manage": 1 00:23:38.336 }, 00:23:38.336 "multi_ctrlr": false, 00:23:38.336 "ana_reporting": false 00:23:38.336 }, 00:23:38.336 "vs": { 00:23:38.336 "nvme_version": "1.2" 00:23:38.336 }, 00:23:38.336 "ns_data": { 00:23:38.336 "id": 1, 00:23:38.336 "can_share": false 00:23:38.336 }, 00:23:38.336 "security": { 00:23:38.336 "opal": true 00:23:38.336 } 00:23:38.336 } 00:23:38.336 ], 00:23:38.336 "mp_policy": "active_passive" 00:23:38.336 } 00:23:38.336 } 00:23:38.336 ] 00:23:38.336 23:45:23 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:38.336 23:45:23 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:38.336 [2024-07-24 23:45:23.298801] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17c1570 PMD being used: compress_qat 00:23:39.268 6c9ff70c-dc13-4550-9dd9-f1294d2e9b0f 00:23:39.268 23:45:24 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:39.526 76846c8f-9101-4409-bbf0-e657c9aade98 00:23:39.526 23:45:24 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:39.526 23:45:24 compress_compdev -- 
common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:39.526 23:45:24 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:39.784 [ 00:23:39.784 { 00:23:39.784 "name": "76846c8f-9101-4409-bbf0-e657c9aade98", 00:23:39.784 "aliases": [ 00:23:39.784 "lvs0/lv0" 00:23:39.784 ], 00:23:39.784 "product_name": "Logical Volume", 00:23:39.784 "block_size": 512, 00:23:39.784 "num_blocks": 204800, 00:23:39.784 "uuid": "76846c8f-9101-4409-bbf0-e657c9aade98", 00:23:39.784 "assigned_rate_limits": { 00:23:39.784 "rw_ios_per_sec": 0, 00:23:39.784 "rw_mbytes_per_sec": 0, 00:23:39.784 "r_mbytes_per_sec": 0, 00:23:39.784 "w_mbytes_per_sec": 0 00:23:39.784 }, 00:23:39.784 "claimed": false, 00:23:39.784 "zoned": false, 00:23:39.784 "supported_io_types": { 00:23:39.784 "read": true, 00:23:39.784 "write": true, 00:23:39.784 "unmap": true, 00:23:39.784 "flush": false, 00:23:39.784 "reset": true, 00:23:39.784 "nvme_admin": false, 00:23:39.784 "nvme_io": false, 00:23:39.784 "nvme_io_md": false, 00:23:39.784 "write_zeroes": true, 00:23:39.784 "zcopy": false, 00:23:39.784 "get_zone_info": false, 00:23:39.784 "zone_management": false, 00:23:39.784 "zone_append": false, 00:23:39.784 "compare": false, 00:23:39.784 "compare_and_write": false, 00:23:39.784 "abort": false, 00:23:39.784 "seek_hole": true, 00:23:39.784 "seek_data": true, 00:23:39.784 "copy": false, 
00:23:39.784 "nvme_iov_md": false 00:23:39.784 }, 00:23:39.784 "driver_specific": { 00:23:39.784 "lvol": { 00:23:39.784 "lvol_store_uuid": "6c9ff70c-dc13-4550-9dd9-f1294d2e9b0f", 00:23:39.784 "base_bdev": "Nvme0n1", 00:23:39.784 "thin_provision": true, 00:23:39.784 "num_allocated_clusters": 0, 00:23:39.784 "snapshot": false, 00:23:39.784 "clone": false, 00:23:39.784 "esnap_clone": false 00:23:39.784 } 00:23:39.784 } 00:23:39.784 } 00:23:39.784 ] 00:23:39.784 23:45:24 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:39.784 23:45:24 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:23:39.784 23:45:24 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:23:40.042 [2024-07-24 23:45:24.827398] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:40.042 COMP_lvs0/lv0 00:23:40.042 23:45:24 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:40.042 23:45:24 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:23:40.042 23:45:24 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:40.042 23:45:24 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:40.043 23:45:24 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:40.043 23:45:24 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:40.043 23:45:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:40.043 23:45:25 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:23:40.301 [ 00:23:40.301 { 00:23:40.301 "name": "COMP_lvs0/lv0", 00:23:40.301 "aliases": [ 
00:23:40.301 "905f6555-f70a-5fec-9ddf-be9bb0fd681a" 00:23:40.301 ], 00:23:40.301 "product_name": "compress", 00:23:40.301 "block_size": 512, 00:23:40.301 "num_blocks": 200704, 00:23:40.301 "uuid": "905f6555-f70a-5fec-9ddf-be9bb0fd681a", 00:23:40.301 "assigned_rate_limits": { 00:23:40.301 "rw_ios_per_sec": 0, 00:23:40.301 "rw_mbytes_per_sec": 0, 00:23:40.301 "r_mbytes_per_sec": 0, 00:23:40.301 "w_mbytes_per_sec": 0 00:23:40.301 }, 00:23:40.301 "claimed": false, 00:23:40.301 "zoned": false, 00:23:40.301 "supported_io_types": { 00:23:40.301 "read": true, 00:23:40.301 "write": true, 00:23:40.301 "unmap": false, 00:23:40.301 "flush": false, 00:23:40.301 "reset": false, 00:23:40.301 "nvme_admin": false, 00:23:40.301 "nvme_io": false, 00:23:40.301 "nvme_io_md": false, 00:23:40.301 "write_zeroes": true, 00:23:40.301 "zcopy": false, 00:23:40.301 "get_zone_info": false, 00:23:40.301 "zone_management": false, 00:23:40.301 "zone_append": false, 00:23:40.301 "compare": false, 00:23:40.301 "compare_and_write": false, 00:23:40.301 "abort": false, 00:23:40.301 "seek_hole": false, 00:23:40.301 "seek_data": false, 00:23:40.301 "copy": false, 00:23:40.301 "nvme_iov_md": false 00:23:40.301 }, 00:23:40.301 "driver_specific": { 00:23:40.301 "compress": { 00:23:40.301 "name": "COMP_lvs0/lv0", 00:23:40.301 "base_bdev_name": "76846c8f-9101-4409-bbf0-e657c9aade98", 00:23:40.301 "pm_path": "/tmp/pmem/5edc5793-1914-4ee3-ac47-d43be6c2ee0d" 00:23:40.301 } 00:23:40.301 } 00:23:40.301 } 00:23:40.301 ] 00:23:40.301 23:45:25 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:40.301 23:45:25 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:40.301 [2024-07-24 23:45:25.261304] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3db81b15c0 PMD being used: compress_qat 00:23:40.301 [2024-07-24 23:45:25.262787] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 
0x17ee560 PMD being used: compress_qat 00:23:40.301 Running I/O for 3 seconds... 00:23:43.585 00:23:43.585 Latency(us) 00:23:43.585 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:43.585 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:23:43.585 Verification LBA range: start 0x0 length 0x3100 00:23:43.585 COMP_lvs0/lv0 : 3.00 4080.43 15.94 0.00 0.00 7807.46 130.68 14667.58 00:23:43.585 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:23:43.585 Verification LBA range: start 0x3100 length 0x3100 00:23:43.585 COMP_lvs0/lv0 : 3.00 4136.32 16.16 0.00 0.00 7705.68 118.98 14168.26 00:23:43.585 =================================================================================================================== 00:23:43.585 Total : 8216.74 32.10 0.00 0.00 7756.23 118.98 14667.58 00:23:43.585 0 00:23:43.585 23:45:28 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:23:43.585 23:45:28 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:23:43.585 23:45:28 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:23:43.843 23:45:28 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:23:43.843 23:45:28 compress_compdev -- compress/compress.sh@78 -- # killprocess 409367 00:23:43.843 23:45:28 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 409367 ']' 00:23:43.843 23:45:28 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 409367 00:23:43.843 23:45:28 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:23:43.843 23:45:28 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:43.843 23:45:28 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 409367 00:23:43.844 23:45:28 
compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:23:43.844 23:45:28 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:23:43.844 23:45:28 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 409367' 00:23:43.844 killing process with pid 409367 00:23:43.844 23:45:28 compress_compdev -- common/autotest_common.sh@969 -- # kill 409367 00:23:43.844 Received shutdown signal, test time was about 3.000000 seconds 00:23:43.844 00:23:43.844 Latency(us) 00:23:43.844 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:43.844 =================================================================================================================== 00:23:43.844 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:43.844 23:45:28 compress_compdev -- common/autotest_common.sh@974 -- # wait 409367 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=411209 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:23:45.220 23:45:30 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 411209 00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 411209 ']' 00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 
00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:45.220 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:45.220 23:45:30 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:45.220 [2024-07-24 23:45:30.177870] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:45.220 [2024-07-24 23:45:30.177911] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid411209 ] 00:23:45.478 [2024-07-24 23:45:30.241040] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:23:45.478 [2024-07-24 23:45:30.310207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:45.478 [2024-07-24 23:45:30.310208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:45.737 [2024-07-24 23:45:30.676659] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:45.995 23:45:30 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:45.995 23:45:30 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:23:45.995 23:45:30 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:23:45.995 23:45:30 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:45.995 23:45:30 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:23:49.277 [2024-07-24 23:45:33.968653] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x227cc00 PMD being used: 
compress_qat 00:23:49.277 23:45:33 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:49.277 23:45:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:49.277 23:45:34 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:23:49.535 [ 00:23:49.535 { 00:23:49.535 "name": "Nvme0n1", 00:23:49.535 "aliases": [ 00:23:49.535 "5daf6b82-0848-44ff-8fcf-3132de2ee399" 00:23:49.535 ], 00:23:49.535 "product_name": "NVMe disk", 00:23:49.535 "block_size": 512, 00:23:49.535 "num_blocks": 1953525168, 00:23:49.535 "uuid": "5daf6b82-0848-44ff-8fcf-3132de2ee399", 00:23:49.535 "assigned_rate_limits": { 00:23:49.535 "rw_ios_per_sec": 0, 00:23:49.535 "rw_mbytes_per_sec": 0, 00:23:49.535 "r_mbytes_per_sec": 0, 00:23:49.535 "w_mbytes_per_sec": 0 00:23:49.535 }, 00:23:49.535 "claimed": false, 00:23:49.535 "zoned": false, 00:23:49.535 "supported_io_types": { 00:23:49.535 "read": true, 00:23:49.535 "write": true, 00:23:49.535 "unmap": true, 00:23:49.535 "flush": true, 00:23:49.535 "reset": true, 00:23:49.535 "nvme_admin": true, 00:23:49.535 "nvme_io": true, 00:23:49.535 "nvme_io_md": false, 00:23:49.535 "write_zeroes": true, 00:23:49.535 "zcopy": false, 00:23:49.535 "get_zone_info": false, 00:23:49.535 "zone_management": false, 00:23:49.535 "zone_append": false, 00:23:49.535 "compare": false, 00:23:49.535 
"compare_and_write": false, 00:23:49.535 "abort": true, 00:23:49.535 "seek_hole": false, 00:23:49.535 "seek_data": false, 00:23:49.535 "copy": false, 00:23:49.535 "nvme_iov_md": false 00:23:49.535 }, 00:23:49.535 "driver_specific": { 00:23:49.535 "nvme": [ 00:23:49.535 { 00:23:49.535 "pci_address": "0000:5e:00.0", 00:23:49.535 "trid": { 00:23:49.535 "trtype": "PCIe", 00:23:49.535 "traddr": "0000:5e:00.0" 00:23:49.535 }, 00:23:49.535 "ctrlr_data": { 00:23:49.535 "cntlid": 0, 00:23:49.535 "vendor_id": "0x8086", 00:23:49.535 "model_number": "INTEL SSDPE2KX010T8", 00:23:49.535 "serial_number": "BTLJ807001JM1P0FGN", 00:23:49.535 "firmware_revision": "VDV10170", 00:23:49.535 "oacs": { 00:23:49.535 "security": 1, 00:23:49.535 "format": 1, 00:23:49.535 "firmware": 1, 00:23:49.535 "ns_manage": 1 00:23:49.535 }, 00:23:49.535 "multi_ctrlr": false, 00:23:49.535 "ana_reporting": false 00:23:49.535 }, 00:23:49.535 "vs": { 00:23:49.535 "nvme_version": "1.2" 00:23:49.535 }, 00:23:49.535 "ns_data": { 00:23:49.535 "id": 1, 00:23:49.535 "can_share": false 00:23:49.535 }, 00:23:49.535 "security": { 00:23:49.535 "opal": true 00:23:49.535 } 00:23:49.535 } 00:23:49.535 ], 00:23:49.535 "mp_policy": "active_passive" 00:23:49.535 } 00:23:49.535 } 00:23:49.535 ] 00:23:49.535 23:45:34 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:49.535 23:45:34 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:23:49.535 [2024-07-24 23:45:34.484355] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x227d570 PMD being used: compress_qat 00:23:50.471 1874cb37-0980-404c-9c73-1cc35860c3ab 00:23:50.471 23:45:35 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:23:50.730 da8d841f-922d-485b-bb09-278eca865c7d 00:23:50.730 23:45:35 compress_compdev -- 
compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:50.730 23:45:35 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:23:50.988 [ 00:23:50.988 { 00:23:50.988 "name": "da8d841f-922d-485b-bb09-278eca865c7d", 00:23:50.988 "aliases": [ 00:23:50.988 "lvs0/lv0" 00:23:50.988 ], 00:23:50.988 "product_name": "Logical Volume", 00:23:50.988 "block_size": 512, 00:23:50.988 "num_blocks": 204800, 00:23:50.988 "uuid": "da8d841f-922d-485b-bb09-278eca865c7d", 00:23:50.988 "assigned_rate_limits": { 00:23:50.988 "rw_ios_per_sec": 0, 00:23:50.988 "rw_mbytes_per_sec": 0, 00:23:50.988 "r_mbytes_per_sec": 0, 00:23:50.988 "w_mbytes_per_sec": 0 00:23:50.988 }, 00:23:50.988 "claimed": false, 00:23:50.988 "zoned": false, 00:23:50.988 "supported_io_types": { 00:23:50.988 "read": true, 00:23:50.988 "write": true, 00:23:50.988 "unmap": true, 00:23:50.988 "flush": false, 00:23:50.988 "reset": true, 00:23:50.988 "nvme_admin": false, 00:23:50.988 "nvme_io": false, 00:23:50.988 "nvme_io_md": false, 00:23:50.988 "write_zeroes": true, 00:23:50.988 "zcopy": false, 00:23:50.988 "get_zone_info": false, 00:23:50.988 "zone_management": false, 00:23:50.988 "zone_append": false, 00:23:50.988 "compare": false, 00:23:50.988 "compare_and_write": false, 00:23:50.988 "abort": false, 
00:23:50.988 "seek_hole": true, 00:23:50.988 "seek_data": true, 00:23:50.988 "copy": false, 00:23:50.988 "nvme_iov_md": false 00:23:50.988 }, 00:23:50.988 "driver_specific": { 00:23:50.988 "lvol": { 00:23:50.988 "lvol_store_uuid": "1874cb37-0980-404c-9c73-1cc35860c3ab", 00:23:50.988 "base_bdev": "Nvme0n1", 00:23:50.988 "thin_provision": true, 00:23:50.988 "num_allocated_clusters": 0, 00:23:50.988 "snapshot": false, 00:23:50.988 "clone": false, 00:23:50.988 "esnap_clone": false 00:23:50.988 } 00:23:50.988 } 00:23:50.988 } 00:23:50.988 ] 00:23:50.988 23:45:35 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:50.988 23:45:35 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:23:50.988 23:45:35 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:23:51.247 [2024-07-24 23:45:36.029087] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:23:51.247 COMP_lvs0/lv0 00:23:51.247 23:45:36 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:23:51.247 23:45:36 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 
00:23:51.506 [ 00:23:51.506 { 00:23:51.506 "name": "COMP_lvs0/lv0", 00:23:51.506 "aliases": [ 00:23:51.506 "04676947-1311-5675-9184-368006d075ae" 00:23:51.506 ], 00:23:51.506 "product_name": "compress", 00:23:51.506 "block_size": 4096, 00:23:51.506 "num_blocks": 25088, 00:23:51.506 "uuid": "04676947-1311-5675-9184-368006d075ae", 00:23:51.506 "assigned_rate_limits": { 00:23:51.506 "rw_ios_per_sec": 0, 00:23:51.506 "rw_mbytes_per_sec": 0, 00:23:51.506 "r_mbytes_per_sec": 0, 00:23:51.506 "w_mbytes_per_sec": 0 00:23:51.506 }, 00:23:51.506 "claimed": false, 00:23:51.506 "zoned": false, 00:23:51.506 "supported_io_types": { 00:23:51.506 "read": true, 00:23:51.506 "write": true, 00:23:51.506 "unmap": false, 00:23:51.506 "flush": false, 00:23:51.506 "reset": false, 00:23:51.506 "nvme_admin": false, 00:23:51.506 "nvme_io": false, 00:23:51.506 "nvme_io_md": false, 00:23:51.506 "write_zeroes": true, 00:23:51.506 "zcopy": false, 00:23:51.506 "get_zone_info": false, 00:23:51.506 "zone_management": false, 00:23:51.506 "zone_append": false, 00:23:51.506 "compare": false, 00:23:51.506 "compare_and_write": false, 00:23:51.506 "abort": false, 00:23:51.506 "seek_hole": false, 00:23:51.506 "seek_data": false, 00:23:51.506 "copy": false, 00:23:51.506 "nvme_iov_md": false 00:23:51.506 }, 00:23:51.506 "driver_specific": { 00:23:51.506 "compress": { 00:23:51.506 "name": "COMP_lvs0/lv0", 00:23:51.506 "base_bdev_name": "da8d841f-922d-485b-bb09-278eca865c7d", 00:23:51.506 "pm_path": "/tmp/pmem/a4d4d46f-1dbc-4d6c-95e4-d59aade69d27" 00:23:51.506 } 00:23:51.506 } 00:23:51.506 } 00:23:51.506 ] 00:23:51.506 23:45:36 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:23:51.506 23:45:36 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:23:51.506 [2024-07-24 23:45:36.491073] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1b381b15c0 PMD being used: compress_qat 00:23:51.506 
[2024-07-24 23:45:36.492503] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22aa330 PMD being used: compress_qat 00:23:51.506 Running I/O for 3 seconds... 00:23:54.790 00:23:54.790 Latency(us) 00:23:54.790 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:54.790 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:23:54.790 Verification LBA range: start 0x0 length 0x3100 00:23:54.790 COMP_lvs0/lv0 : 3.01 3987.98 15.58 0.00 0.00 7981.30 173.59 13606.52 00:23:54.790 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:23:54.790 Verification LBA range: start 0x3100 length 0x3100 00:23:54.790 COMP_lvs0/lv0 : 3.01 4062.15 15.87 0.00 0.00 7841.99 165.79 13668.94 00:23:54.790 =================================================================================================================== 00:23:54.790 Total : 8050.13 31.45 0.00 0.00 7911.03 165.79 13668.94 00:23:54.790 0 00:23:54.790 23:45:39 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:23:54.790 23:45:39 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:23:54.790 23:45:39 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:23:55.049 23:45:39 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:23:55.049 23:45:39 compress_compdev -- compress/compress.sh@78 -- # killprocess 411209 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 411209 ']' 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 411209 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:55.049 23:45:39 compress_compdev -- 
common/autotest_common.sh@956 -- # ps --no-headers -o comm= 411209 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 411209' 00:23:55.049 killing process with pid 411209 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@969 -- # kill 411209 00:23:55.049 Received shutdown signal, test time was about 3.000000 seconds 00:23:55.049 00:23:55.049 Latency(us) 00:23:55.049 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:55.049 =================================================================================================================== 00:23:55.049 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:55.049 23:45:39 compress_compdev -- common/autotest_common.sh@974 -- # wait 411209 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=413048 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:23:56.425 23:45:41 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 413048 00:23:56.425 23:45:41 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 413048 ']' 00:23:56.425 23:45:41 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:23:56.425 23:45:41 compress_compdev -- 
common/autotest_common.sh@836 -- # local max_retries=100 00:23:56.425 23:45:41 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:23:56.425 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:23:56.425 23:45:41 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:56.425 23:45:41 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:23:56.425 [2024-07-24 23:45:41.401721] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:23:56.425 [2024-07-24 23:45:41.401764] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid413048 ] 00:23:56.684 [2024-07-24 23:45:41.467347] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:23:56.684 [2024-07-24 23:45:41.546235] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:23:56.684 [2024-07-24 23:45:41.546333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:23:56.684 [2024-07-24 23:45:41.546335] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:56.943 [2024-07-24 23:45:41.915069] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:23:57.510 23:45:42 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:57.510 23:45:42 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:23:57.510 23:45:42 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:23:57.510 23:45:42 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:23:57.510 23:45:42 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:00.863 [2024-07-24 23:45:45.198724] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a4820 PMD being used: compress_qat 00:24:00.863 23:45:45 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:00.863 23:45:45 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:00.863 [ 00:24:00.863 { 00:24:00.863 "name": "Nvme0n1", 00:24:00.863 "aliases": [ 00:24:00.863 "8baf6f25-c9d2-46dd-918d-24f7886a5aac" 00:24:00.863 ], 00:24:00.864 "product_name": "NVMe disk", 00:24:00.864 "block_size": 512, 00:24:00.864 "num_blocks": 1953525168, 00:24:00.864 "uuid": "8baf6f25-c9d2-46dd-918d-24f7886a5aac", 00:24:00.864 "assigned_rate_limits": { 00:24:00.864 "rw_ios_per_sec": 0, 00:24:00.864 "rw_mbytes_per_sec": 0, 00:24:00.864 "r_mbytes_per_sec": 0, 00:24:00.864 "w_mbytes_per_sec": 0 00:24:00.864 }, 00:24:00.864 "claimed": false, 00:24:00.864 "zoned": false, 00:24:00.864 "supported_io_types": { 00:24:00.864 "read": true, 00:24:00.864 "write": true, 00:24:00.864 "unmap": true, 00:24:00.864 "flush": true, 00:24:00.864 "reset": true, 00:24:00.864 "nvme_admin": true, 00:24:00.864 "nvme_io": true, 00:24:00.864 "nvme_io_md": false, 00:24:00.864 
"write_zeroes": true, 00:24:00.864 "zcopy": false, 00:24:00.864 "get_zone_info": false, 00:24:00.864 "zone_management": false, 00:24:00.864 "zone_append": false, 00:24:00.864 "compare": false, 00:24:00.864 "compare_and_write": false, 00:24:00.864 "abort": true, 00:24:00.864 "seek_hole": false, 00:24:00.864 "seek_data": false, 00:24:00.864 "copy": false, 00:24:00.864 "nvme_iov_md": false 00:24:00.864 }, 00:24:00.864 "driver_specific": { 00:24:00.864 "nvme": [ 00:24:00.864 { 00:24:00.864 "pci_address": "0000:5e:00.0", 00:24:00.864 "trid": { 00:24:00.864 "trtype": "PCIe", 00:24:00.864 "traddr": "0000:5e:00.0" 00:24:00.864 }, 00:24:00.864 "ctrlr_data": { 00:24:00.864 "cntlid": 0, 00:24:00.864 "vendor_id": "0x8086", 00:24:00.864 "model_number": "INTEL SSDPE2KX010T8", 00:24:00.864 "serial_number": "BTLJ807001JM1P0FGN", 00:24:00.864 "firmware_revision": "VDV10170", 00:24:00.864 "oacs": { 00:24:00.864 "security": 1, 00:24:00.864 "format": 1, 00:24:00.864 "firmware": 1, 00:24:00.864 "ns_manage": 1 00:24:00.864 }, 00:24:00.864 "multi_ctrlr": false, 00:24:00.864 "ana_reporting": false 00:24:00.864 }, 00:24:00.864 "vs": { 00:24:00.864 "nvme_version": "1.2" 00:24:00.864 }, 00:24:00.864 "ns_data": { 00:24:00.864 "id": 1, 00:24:00.864 "can_share": false 00:24:00.864 }, 00:24:00.864 "security": { 00:24:00.864 "opal": true 00:24:00.864 } 00:24:00.864 } 00:24:00.864 ], 00:24:00.864 "mp_policy": "active_passive" 00:24:00.864 } 00:24:00.864 } 00:24:00.864 ] 00:24:00.864 23:45:45 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:24:00.864 23:45:45 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:00.864 [2024-07-24 23:45:45.698306] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a5190 PMD being used: compress_qat 00:24:01.799 23538bf0-03f1-43bf-9697-aa71ad5dc607 00:24:01.799 23:45:46 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:01.799 2b1315aa-4066-46ef-8e10-8d2032e496e1 00:24:01.799 23:45:46 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:01.799 23:45:46 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:02.057 23:45:46 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:02.315 [ 00:24:02.315 { 00:24:02.315 "name": "2b1315aa-4066-46ef-8e10-8d2032e496e1", 00:24:02.315 "aliases": [ 00:24:02.315 "lvs0/lv0" 00:24:02.315 ], 00:24:02.315 "product_name": "Logical Volume", 00:24:02.315 "block_size": 512, 00:24:02.315 "num_blocks": 204800, 00:24:02.315 "uuid": "2b1315aa-4066-46ef-8e10-8d2032e496e1", 00:24:02.315 "assigned_rate_limits": { 00:24:02.315 "rw_ios_per_sec": 0, 00:24:02.315 "rw_mbytes_per_sec": 0, 00:24:02.315 "r_mbytes_per_sec": 0, 00:24:02.315 "w_mbytes_per_sec": 0 00:24:02.315 }, 00:24:02.315 "claimed": false, 00:24:02.315 "zoned": false, 00:24:02.315 "supported_io_types": { 00:24:02.315 "read": true, 00:24:02.315 "write": true, 00:24:02.315 "unmap": true, 00:24:02.315 "flush": false, 00:24:02.315 "reset": true, 00:24:02.315 "nvme_admin": false, 00:24:02.315 "nvme_io": false, 00:24:02.315 "nvme_io_md": false, 00:24:02.315 "write_zeroes": true, 00:24:02.315 "zcopy": false, 00:24:02.315 
"get_zone_info": false, 00:24:02.315 "zone_management": false, 00:24:02.315 "zone_append": false, 00:24:02.315 "compare": false, 00:24:02.315 "compare_and_write": false, 00:24:02.315 "abort": false, 00:24:02.315 "seek_hole": true, 00:24:02.315 "seek_data": true, 00:24:02.315 "copy": false, 00:24:02.315 "nvme_iov_md": false 00:24:02.315 }, 00:24:02.315 "driver_specific": { 00:24:02.315 "lvol": { 00:24:02.315 "lvol_store_uuid": "23538bf0-03f1-43bf-9697-aa71ad5dc607", 00:24:02.315 "base_bdev": "Nvme0n1", 00:24:02.315 "thin_provision": true, 00:24:02.315 "num_allocated_clusters": 0, 00:24:02.315 "snapshot": false, 00:24:02.315 "clone": false, 00:24:02.315 "esnap_clone": false 00:24:02.315 } 00:24:02.315 } 00:24:02.315 } 00:24:02.315 ] 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:24:02.315 23:45:47 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:02.315 23:45:47 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:02.315 [2024-07-24 23:45:47.248285] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:02.315 COMP_lvs0/lv0 00:24:02.315 23:45:47 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:02.315 23:45:47 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:02.316 23:45:47 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:24:02.574 23:45:47 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:02.833 [ 00:24:02.833 { 00:24:02.833 "name": "COMP_lvs0/lv0", 00:24:02.833 "aliases": [ 00:24:02.833 "2a28159f-5351-5b0a-8ef9-bfd0860af095" 00:24:02.833 ], 00:24:02.833 "product_name": "compress", 00:24:02.833 "block_size": 512, 00:24:02.833 "num_blocks": 200704, 00:24:02.833 "uuid": "2a28159f-5351-5b0a-8ef9-bfd0860af095", 00:24:02.833 "assigned_rate_limits": { 00:24:02.833 "rw_ios_per_sec": 0, 00:24:02.833 "rw_mbytes_per_sec": 0, 00:24:02.833 "r_mbytes_per_sec": 0, 00:24:02.833 "w_mbytes_per_sec": 0 00:24:02.833 }, 00:24:02.833 "claimed": false, 00:24:02.833 "zoned": false, 00:24:02.833 "supported_io_types": { 00:24:02.833 "read": true, 00:24:02.833 "write": true, 00:24:02.833 "unmap": false, 00:24:02.833 "flush": false, 00:24:02.833 "reset": false, 00:24:02.833 "nvme_admin": false, 00:24:02.833 "nvme_io": false, 00:24:02.833 "nvme_io_md": false, 00:24:02.833 "write_zeroes": true, 00:24:02.833 "zcopy": false, 00:24:02.833 "get_zone_info": false, 00:24:02.833 "zone_management": false, 00:24:02.833 "zone_append": false, 00:24:02.833 "compare": false, 00:24:02.833 "compare_and_write": false, 00:24:02.833 "abort": false, 00:24:02.833 "seek_hole": false, 00:24:02.833 "seek_data": false, 00:24:02.833 "copy": false, 00:24:02.833 "nvme_iov_md": false 00:24:02.833 }, 00:24:02.833 "driver_specific": { 00:24:02.833 "compress": { 00:24:02.833 "name": "COMP_lvs0/lv0", 00:24:02.833 "base_bdev_name": "2b1315aa-4066-46ef-8e10-8d2032e496e1", 00:24:02.833 "pm_path": "/tmp/pmem/27a0684a-32a2-4ace-9ca4-fb46e0071e92" 00:24:02.833 } 00:24:02.833 } 00:24:02.833 } 00:24:02.833 ] 00:24:02.833 23:45:47 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:24:02.833 23:45:47 compress_compdev -- compress/compress.sh@59 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:02.833 [2024-07-24 23:45:47.677131] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f45f01b1350 PMD being used: compress_qat 00:24:02.833 I/O targets: 00:24:02.833 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:02.833 00:24:02.833 00:24:02.833 CUnit - A unit testing framework for C - Version 2.1-3 00:24:02.833 http://cunit.sourceforge.net/ 00:24:02.833 00:24:02.833 00:24:02.833 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:02.833 Test: blockdev write read block ...passed 00:24:02.833 Test: blockdev write zeroes read block ...passed 00:24:02.833 Test: blockdev write zeroes read no split ...passed 00:24:02.833 Test: blockdev write zeroes read split ...passed 00:24:02.833 Test: blockdev write zeroes read split partial ...passed 00:24:02.833 Test: blockdev reset ...[2024-07-24 23:45:47.734201] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:02.833 passed 00:24:02.833 Test: blockdev write read 8 blocks ...passed 00:24:02.833 Test: blockdev write read size > 128k ...passed 00:24:02.833 Test: blockdev write read invalid size ...passed 00:24:02.833 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:02.833 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:02.833 Test: blockdev write read max offset ...passed 00:24:02.833 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:02.833 Test: blockdev writev readv 8 blocks ...passed 00:24:02.833 Test: blockdev writev readv 30 x 1block ...passed 00:24:02.833 Test: blockdev writev readv block ...passed 00:24:02.833 Test: blockdev writev readv size > 128k ...passed 00:24:02.833 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:02.833 Test: blockdev comparev and writev ...passed 00:24:02.833 Test: blockdev nvme passthru rw ...passed 00:24:02.833 Test: blockdev nvme passthru vendor 
specific ...passed 00:24:02.833 Test: blockdev nvme admin passthru ...passed 00:24:02.833 Test: blockdev copy ...passed 00:24:02.833 00:24:02.833 Run Summary: Type Total Ran Passed Failed Inactive 00:24:02.833 suites 1 1 n/a 0 0 00:24:02.833 tests 23 23 23 0 0 00:24:02.833 asserts 130 130 130 0 n/a 00:24:02.833 00:24:02.833 Elapsed time = 0.156 seconds 00:24:02.833 0 00:24:02.833 23:45:47 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:24:02.833 23:45:47 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:03.091 23:45:47 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:03.349 23:45:48 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:03.349 23:45:48 compress_compdev -- compress/compress.sh@62 -- # killprocess 413048 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 413048 ']' 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 413048 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 413048 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 413048' 00:24:03.349 killing process with pid 413048 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@969 -- # kill 413048 00:24:03.349 23:45:48 compress_compdev -- common/autotest_common.sh@974 -- # wait 413048 
00:24:04.724 23:45:49 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:24:04.724 23:45:49 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:24:04.724 00:24:04.724 real 0m42.014s 00:24:04.724 user 1m34.275s 00:24:04.724 sys 0m3.332s 00:24:04.724 23:45:49 compress_compdev -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:04.724 23:45:49 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:04.724 ************************************ 00:24:04.724 END TEST compress_compdev 00:24:04.724 ************************************ 00:24:04.724 23:45:49 -- spdk/autotest.sh@353 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:04.724 23:45:49 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:24:04.724 23:45:49 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:04.724 23:45:49 -- common/autotest_common.sh@10 -- # set +x 00:24:04.724 ************************************ 00:24:04.724 START TEST compress_isal 00:24:04.724 ************************************ 00:24:04.724 23:45:49 compress_isal -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:24:04.984 * Looking for test storage... 
00:24:04.984 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:04.984 23:45:49 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:04.984 23:45:49 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:04.984 23:45:49 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:04.984 23:45:49 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.984 23:45:49 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.984 23:45:49 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.984 23:45:49 compress_isal -- paths/export.sh@5 -- # export PATH 00:24:04.984 23:45:49 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@47 -- # : 0 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:04.984 23:45:49 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=414488 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:04.984 23:45:49 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 414488 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 414488 ']' 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:04.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:04.984 23:45:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:04.984 23:45:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:04.984 [2024-07-24 23:45:49.856868] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:24:04.984 [2024-07-24 23:45:49.856914] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid414488 ] 00:24:04.984 [2024-07-24 23:45:49.921796] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:05.242 [2024-07-24 23:45:50.000354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:05.242 [2024-07-24 23:45:50.000357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:05.808 23:45:50 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:05.808 23:45:50 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:24:05.808 23:45:50 compress_isal -- compress/compress.sh@74 -- # create_vols 00:24:05.808 23:45:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:05.808 23:45:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:09.091 23:45:53 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:09.091 23:45:53 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:09.091 [ 00:24:09.091 { 00:24:09.091 "name": "Nvme0n1", 00:24:09.091 "aliases": [ 00:24:09.091 "ed94ca00-3044-4c4f-8378-285b3b6fea60" 00:24:09.091 ], 00:24:09.091 "product_name": "NVMe disk", 00:24:09.091 "block_size": 512, 00:24:09.091 "num_blocks": 1953525168, 00:24:09.091 "uuid": "ed94ca00-3044-4c4f-8378-285b3b6fea60", 00:24:09.091 "assigned_rate_limits": { 00:24:09.091 "rw_ios_per_sec": 0, 00:24:09.091 "rw_mbytes_per_sec": 0, 00:24:09.091 "r_mbytes_per_sec": 0, 00:24:09.091 "w_mbytes_per_sec": 0 00:24:09.091 }, 00:24:09.091 "claimed": false, 00:24:09.091 "zoned": false, 00:24:09.091 "supported_io_types": { 00:24:09.091 "read": true, 00:24:09.091 "write": true, 00:24:09.091 "unmap": true, 00:24:09.091 "flush": true, 00:24:09.091 "reset": true, 00:24:09.091 "nvme_admin": true, 00:24:09.091 "nvme_io": true, 00:24:09.091 "nvme_io_md": false, 00:24:09.091 "write_zeroes": true, 00:24:09.091 "zcopy": false, 00:24:09.091 "get_zone_info": false, 00:24:09.091 "zone_management": false, 00:24:09.091 "zone_append": false, 00:24:09.091 "compare": false, 00:24:09.091 "compare_and_write": false, 00:24:09.091 "abort": true, 00:24:09.091 "seek_hole": false, 00:24:09.091 "seek_data": false, 00:24:09.091 "copy": false, 00:24:09.091 "nvme_iov_md": false 00:24:09.091 }, 00:24:09.091 "driver_specific": { 00:24:09.091 "nvme": [ 00:24:09.091 { 00:24:09.091 "pci_address": "0000:5e:00.0", 00:24:09.091 "trid": { 00:24:09.091 "trtype": "PCIe", 00:24:09.091 "traddr": "0000:5e:00.0" 00:24:09.091 }, 00:24:09.091 "ctrlr_data": { 00:24:09.091 "cntlid": 0, 00:24:09.091 "vendor_id": "0x8086", 00:24:09.091 "model_number": "INTEL SSDPE2KX010T8", 00:24:09.091 "serial_number": "BTLJ807001JM1P0FGN", 00:24:09.091 "firmware_revision": "VDV10170", 00:24:09.091 "oacs": { 00:24:09.091 "security": 1, 00:24:09.091 "format": 1, 00:24:09.091 "firmware": 1, 00:24:09.091 "ns_manage": 1 00:24:09.091 }, 00:24:09.091 "multi_ctrlr": false, 00:24:09.091 "ana_reporting": false 
00:24:09.091 }, 00:24:09.091 "vs": { 00:24:09.091 "nvme_version": "1.2" 00:24:09.091 }, 00:24:09.091 "ns_data": { 00:24:09.091 "id": 1, 00:24:09.091 "can_share": false 00:24:09.091 }, 00:24:09.091 "security": { 00:24:09.091 "opal": true 00:24:09.091 } 00:24:09.091 } 00:24:09.091 ], 00:24:09.091 "mp_policy": "active_passive" 00:24:09.091 } 00:24:09.091 } 00:24:09.091 ] 00:24:09.091 23:45:54 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:09.091 23:45:54 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:10.466 2f24c30f-f735-4f64-a21e-fb7f9a4b3632 00:24:10.466 23:45:55 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:10.466 ef8474b8-b965-4b2e-bb45-bd8a36b1f260 00:24:10.466 23:45:55 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:10.466 23:45:55 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:10.725 [ 00:24:10.725 { 00:24:10.725 "name": "ef8474b8-b965-4b2e-bb45-bd8a36b1f260", 00:24:10.725 "aliases": [ 00:24:10.725 "lvs0/lv0" 00:24:10.725 ], 00:24:10.725 "product_name": "Logical Volume", 00:24:10.725 
"block_size": 512, 00:24:10.725 "num_blocks": 204800, 00:24:10.725 "uuid": "ef8474b8-b965-4b2e-bb45-bd8a36b1f260", 00:24:10.725 "assigned_rate_limits": { 00:24:10.725 "rw_ios_per_sec": 0, 00:24:10.725 "rw_mbytes_per_sec": 0, 00:24:10.725 "r_mbytes_per_sec": 0, 00:24:10.725 "w_mbytes_per_sec": 0 00:24:10.725 }, 00:24:10.725 "claimed": false, 00:24:10.725 "zoned": false, 00:24:10.725 "supported_io_types": { 00:24:10.725 "read": true, 00:24:10.725 "write": true, 00:24:10.725 "unmap": true, 00:24:10.725 "flush": false, 00:24:10.725 "reset": true, 00:24:10.725 "nvme_admin": false, 00:24:10.725 "nvme_io": false, 00:24:10.725 "nvme_io_md": false, 00:24:10.725 "write_zeroes": true, 00:24:10.725 "zcopy": false, 00:24:10.725 "get_zone_info": false, 00:24:10.725 "zone_management": false, 00:24:10.725 "zone_append": false, 00:24:10.725 "compare": false, 00:24:10.725 "compare_and_write": false, 00:24:10.725 "abort": false, 00:24:10.725 "seek_hole": true, 00:24:10.725 "seek_data": true, 00:24:10.725 "copy": false, 00:24:10.725 "nvme_iov_md": false 00:24:10.725 }, 00:24:10.725 "driver_specific": { 00:24:10.725 "lvol": { 00:24:10.725 "lvol_store_uuid": "2f24c30f-f735-4f64-a21e-fb7f9a4b3632", 00:24:10.725 "base_bdev": "Nvme0n1", 00:24:10.725 "thin_provision": true, 00:24:10.725 "num_allocated_clusters": 0, 00:24:10.725 "snapshot": false, 00:24:10.725 "clone": false, 00:24:10.725 "esnap_clone": false 00:24:10.725 } 00:24:10.725 } 00:24:10.725 } 00:24:10.725 ] 00:24:10.725 23:45:55 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:10.725 23:45:55 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:10.725 23:45:55 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:10.984 [2024-07-24 23:45:55.794917] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:10.984 COMP_lvs0/lv0 
00:24:10.984 23:45:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:10.984 23:45:55 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:11.242 23:45:55 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:11.242 [ 00:24:11.242 { 00:24:11.242 "name": "COMP_lvs0/lv0", 00:24:11.242 "aliases": [ 00:24:11.242 "8e3087b6-67a7-5a52-bf03-7ed9539d46cb" 00:24:11.242 ], 00:24:11.242 "product_name": "compress", 00:24:11.242 "block_size": 512, 00:24:11.242 "num_blocks": 200704, 00:24:11.242 "uuid": "8e3087b6-67a7-5a52-bf03-7ed9539d46cb", 00:24:11.242 "assigned_rate_limits": { 00:24:11.242 "rw_ios_per_sec": 0, 00:24:11.242 "rw_mbytes_per_sec": 0, 00:24:11.242 "r_mbytes_per_sec": 0, 00:24:11.242 "w_mbytes_per_sec": 0 00:24:11.242 }, 00:24:11.242 "claimed": false, 00:24:11.242 "zoned": false, 00:24:11.242 "supported_io_types": { 00:24:11.242 "read": true, 00:24:11.242 "write": true, 00:24:11.242 "unmap": false, 00:24:11.242 "flush": false, 00:24:11.242 "reset": false, 00:24:11.242 "nvme_admin": false, 00:24:11.242 "nvme_io": false, 00:24:11.242 "nvme_io_md": false, 00:24:11.242 "write_zeroes": true, 00:24:11.242 "zcopy": false, 00:24:11.242 "get_zone_info": false, 00:24:11.242 "zone_management": false, 00:24:11.242 "zone_append": false, 00:24:11.242 "compare": false, 00:24:11.242 "compare_and_write": 
false, 00:24:11.242 "abort": false, 00:24:11.242 "seek_hole": false, 00:24:11.242 "seek_data": false, 00:24:11.242 "copy": false, 00:24:11.242 "nvme_iov_md": false 00:24:11.242 }, 00:24:11.242 "driver_specific": { 00:24:11.242 "compress": { 00:24:11.242 "name": "COMP_lvs0/lv0", 00:24:11.242 "base_bdev_name": "ef8474b8-b965-4b2e-bb45-bd8a36b1f260", 00:24:11.243 "pm_path": "/tmp/pmem/867ccad3-4deb-405d-b054-407d6368be51" 00:24:11.243 } 00:24:11.243 } 00:24:11.243 } 00:24:11.243 ] 00:24:11.243 23:45:56 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:11.243 23:45:56 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:11.243 Running I/O for 3 seconds... 00:24:14.528 00:24:14.528 Latency(us) 00:24:14.529 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:14.529 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:14.529 Verification LBA range: start 0x0 length 0x3100 00:24:14.529 COMP_lvs0/lv0 : 3.01 3381.95 13.21 0.00 0.00 9413.52 54.37 14417.92 00:24:14.529 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:14.529 Verification LBA range: start 0x3100 length 0x3100 00:24:14.529 COMP_lvs0/lv0 : 3.01 3408.05 13.31 0.00 0.00 9343.39 54.61 14917.24 00:24:14.529 =================================================================================================================== 00:24:14.529 Total : 6790.00 26.52 0.00 0.00 9378.33 54.37 14917.24 00:24:14.529 0 00:24:14.529 23:45:59 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:14.529 23:45:59 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:14.529 23:45:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:24:14.787 23:45:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:14.787 23:45:59 compress_isal -- compress/compress.sh@78 -- # killprocess 414488 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 414488 ']' 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@954 -- # kill -0 414488 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@955 -- # uname 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 414488 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 414488' 00:24:14.787 killing process with pid 414488 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@969 -- # kill 414488 00:24:14.787 Received shutdown signal, test time was about 3.000000 seconds 00:24:14.787 00:24:14.787 Latency(us) 00:24:14.787 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:14.787 =================================================================================================================== 00:24:14.787 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:14.787 23:45:59 compress_isal -- common/autotest_common.sh@974 -- # wait 414488 00:24:16.162 23:46:01 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:24:16.162 23:46:01 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:16.162 23:46:01 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=416330 00:24:16.162 23:46:01 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:16.162 23:46:01 
compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:16.162 23:46:01 compress_isal -- compress/compress.sh@73 -- # waitforlisten 416330 00:24:16.162 23:46:01 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 416330 ']' 00:24:16.162 23:46:01 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:16.162 23:46:01 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:16.162 23:46:01 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:16.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:16.163 23:46:01 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:16.163 23:46:01 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:16.163 [2024-07-24 23:46:01.145244] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:24:16.163 [2024-07-24 23:46:01.145287] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid416330 ] 00:24:16.421 [2024-07-24 23:46:01.207954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:16.421 [2024-07-24 23:46:01.285126] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:16.421 [2024-07-24 23:46:01.285128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:16.986 23:46:01 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:16.986 23:46:01 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:24:16.986 23:46:01 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:24:16.986 23:46:01 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:16.986 23:46:01 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:20.270 23:46:04 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:20.270 23:46:04 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:20.270 23:46:05 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:20.529 [ 00:24:20.529 { 00:24:20.529 "name": "Nvme0n1", 00:24:20.529 "aliases": [ 00:24:20.529 "ad46ba36-3953-43c9-a3c3-a0909224658b" 00:24:20.529 ], 00:24:20.529 "product_name": "NVMe disk", 00:24:20.529 "block_size": 512, 00:24:20.529 "num_blocks": 1953525168, 00:24:20.529 "uuid": "ad46ba36-3953-43c9-a3c3-a0909224658b", 00:24:20.529 "assigned_rate_limits": { 00:24:20.529 "rw_ios_per_sec": 0, 00:24:20.529 "rw_mbytes_per_sec": 0, 00:24:20.529 "r_mbytes_per_sec": 0, 00:24:20.529 "w_mbytes_per_sec": 0 00:24:20.529 }, 00:24:20.529 "claimed": false, 00:24:20.529 "zoned": false, 00:24:20.529 "supported_io_types": { 00:24:20.529 "read": true, 00:24:20.529 "write": true, 00:24:20.529 "unmap": true, 00:24:20.529 "flush": true, 00:24:20.529 "reset": true, 00:24:20.529 "nvme_admin": true, 00:24:20.529 "nvme_io": true, 00:24:20.529 "nvme_io_md": false, 00:24:20.529 "write_zeroes": true, 00:24:20.529 "zcopy": false, 00:24:20.529 "get_zone_info": false, 00:24:20.529 "zone_management": false, 00:24:20.529 "zone_append": false, 00:24:20.529 "compare": false, 00:24:20.529 "compare_and_write": false, 00:24:20.529 "abort": true, 00:24:20.529 "seek_hole": false, 00:24:20.529 "seek_data": false, 00:24:20.529 "copy": false, 00:24:20.529 "nvme_iov_md": false 00:24:20.529 }, 00:24:20.529 "driver_specific": { 00:24:20.529 "nvme": [ 00:24:20.529 { 00:24:20.529 "pci_address": "0000:5e:00.0", 00:24:20.529 "trid": { 00:24:20.529 "trtype": "PCIe", 00:24:20.529 "traddr": "0000:5e:00.0" 00:24:20.529 }, 00:24:20.529 "ctrlr_data": { 00:24:20.529 "cntlid": 0, 00:24:20.529 "vendor_id": "0x8086", 00:24:20.529 "model_number": "INTEL SSDPE2KX010T8", 00:24:20.529 "serial_number": "BTLJ807001JM1P0FGN", 00:24:20.529 "firmware_revision": "VDV10170", 00:24:20.529 "oacs": { 00:24:20.529 "security": 1, 00:24:20.529 "format": 1, 00:24:20.529 "firmware": 1, 00:24:20.529 "ns_manage": 1 00:24:20.529 }, 00:24:20.529 "multi_ctrlr": false, 00:24:20.529 "ana_reporting": false 
00:24:20.529 }, 00:24:20.529 "vs": { 00:24:20.529 "nvme_version": "1.2" 00:24:20.529 }, 00:24:20.529 "ns_data": { 00:24:20.529 "id": 1, 00:24:20.529 "can_share": false 00:24:20.529 }, 00:24:20.529 "security": { 00:24:20.529 "opal": true 00:24:20.529 } 00:24:20.529 } 00:24:20.529 ], 00:24:20.529 "mp_policy": "active_passive" 00:24:20.529 } 00:24:20.529 } 00:24:20.529 ] 00:24:20.529 23:46:05 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:20.529 23:46:05 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:21.465 2d5eae87-0424-4f23-819f-a4b5be171268 00:24:21.465 23:46:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:21.465 da5a465c-17f7-4323-b8a2-cb88eb3ea5b8 00:24:21.465 23:46:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:21.465 23:46:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:21.724 23:46:06 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:21.982 [ 00:24:21.982 { 00:24:21.982 "name": "da5a465c-17f7-4323-b8a2-cb88eb3ea5b8", 00:24:21.982 "aliases": [ 00:24:21.982 "lvs0/lv0" 00:24:21.982 ], 00:24:21.982 "product_name": "Logical Volume", 00:24:21.982 
"block_size": 512, 00:24:21.982 "num_blocks": 204800, 00:24:21.982 "uuid": "da5a465c-17f7-4323-b8a2-cb88eb3ea5b8", 00:24:21.982 "assigned_rate_limits": { 00:24:21.982 "rw_ios_per_sec": 0, 00:24:21.982 "rw_mbytes_per_sec": 0, 00:24:21.982 "r_mbytes_per_sec": 0, 00:24:21.982 "w_mbytes_per_sec": 0 00:24:21.982 }, 00:24:21.982 "claimed": false, 00:24:21.982 "zoned": false, 00:24:21.982 "supported_io_types": { 00:24:21.982 "read": true, 00:24:21.982 "write": true, 00:24:21.982 "unmap": true, 00:24:21.982 "flush": false, 00:24:21.982 "reset": true, 00:24:21.982 "nvme_admin": false, 00:24:21.982 "nvme_io": false, 00:24:21.982 "nvme_io_md": false, 00:24:21.982 "write_zeroes": true, 00:24:21.982 "zcopy": false, 00:24:21.982 "get_zone_info": false, 00:24:21.982 "zone_management": false, 00:24:21.982 "zone_append": false, 00:24:21.982 "compare": false, 00:24:21.982 "compare_and_write": false, 00:24:21.982 "abort": false, 00:24:21.982 "seek_hole": true, 00:24:21.982 "seek_data": true, 00:24:21.982 "copy": false, 00:24:21.982 "nvme_iov_md": false 00:24:21.982 }, 00:24:21.982 "driver_specific": { 00:24:21.982 "lvol": { 00:24:21.982 "lvol_store_uuid": "2d5eae87-0424-4f23-819f-a4b5be171268", 00:24:21.982 "base_bdev": "Nvme0n1", 00:24:21.982 "thin_provision": true, 00:24:21.982 "num_allocated_clusters": 0, 00:24:21.982 "snapshot": false, 00:24:21.982 "clone": false, 00:24:21.982 "esnap_clone": false 00:24:21.982 } 00:24:21.982 } 00:24:21.982 } 00:24:21.982 ] 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:21.982 23:46:06 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:24:21.982 23:46:06 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:24:21.982 [2024-07-24 23:46:06.940333] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:21.982 COMP_lvs0/lv0 
00:24:21.982 23:46:06 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:21.982 23:46:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:22.245 23:46:07 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:22.591 [ 00:24:22.591 { 00:24:22.591 "name": "COMP_lvs0/lv0", 00:24:22.591 "aliases": [ 00:24:22.591 "bf4633d0-1f5e-5581-b7b5-8455dec74616" 00:24:22.591 ], 00:24:22.591 "product_name": "compress", 00:24:22.591 "block_size": 512, 00:24:22.591 "num_blocks": 200704, 00:24:22.591 "uuid": "bf4633d0-1f5e-5581-b7b5-8455dec74616", 00:24:22.591 "assigned_rate_limits": { 00:24:22.591 "rw_ios_per_sec": 0, 00:24:22.591 "rw_mbytes_per_sec": 0, 00:24:22.591 "r_mbytes_per_sec": 0, 00:24:22.591 "w_mbytes_per_sec": 0 00:24:22.591 }, 00:24:22.591 "claimed": false, 00:24:22.591 "zoned": false, 00:24:22.591 "supported_io_types": { 00:24:22.591 "read": true, 00:24:22.591 "write": true, 00:24:22.591 "unmap": false, 00:24:22.592 "flush": false, 00:24:22.592 "reset": false, 00:24:22.592 "nvme_admin": false, 00:24:22.592 "nvme_io": false, 00:24:22.592 "nvme_io_md": false, 00:24:22.592 "write_zeroes": true, 00:24:22.592 "zcopy": false, 00:24:22.592 "get_zone_info": false, 00:24:22.592 "zone_management": false, 00:24:22.592 "zone_append": false, 00:24:22.592 "compare": false, 00:24:22.592 "compare_and_write": 
false, 00:24:22.592 "abort": false, 00:24:22.592 "seek_hole": false, 00:24:22.592 "seek_data": false, 00:24:22.592 "copy": false, 00:24:22.592 "nvme_iov_md": false 00:24:22.592 }, 00:24:22.592 "driver_specific": { 00:24:22.592 "compress": { 00:24:22.592 "name": "COMP_lvs0/lv0", 00:24:22.592 "base_bdev_name": "da5a465c-17f7-4323-b8a2-cb88eb3ea5b8", 00:24:22.592 "pm_path": "/tmp/pmem/7efbb866-1b9d-4de0-9653-bf13dccee57b" 00:24:22.592 } 00:24:22.592 } 00:24:22.592 } 00:24:22.592 ] 00:24:22.592 23:46:07 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:22.592 23:46:07 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:22.592 Running I/O for 3 seconds... 00:24:25.875 00:24:25.875 Latency(us) 00:24:25.875 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:25.875 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:25.875 Verification LBA range: start 0x0 length 0x3100 00:24:25.875 COMP_lvs0/lv0 : 3.01 3358.64 13.12 0.00 0.00 9490.65 55.59 15416.56 00:24:25.875 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:25.875 Verification LBA range: start 0x3100 length 0x3100 00:24:25.875 COMP_lvs0/lv0 : 3.01 3355.76 13.11 0.00 0.00 9487.63 55.34 15354.15 00:24:25.875 =================================================================================================================== 00:24:25.875 Total : 6714.40 26.23 0.00 0.00 9489.14 55.34 15416.56 00:24:25.875 0 00:24:25.875 23:46:10 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:25.875 23:46:10 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:25.875 23:46:10 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:24:25.875 23:46:10 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:25.875 23:46:10 compress_isal -- compress/compress.sh@78 -- # killprocess 416330 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 416330 ']' 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@954 -- # kill -0 416330 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@955 -- # uname 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 416330 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 416330' 00:24:25.875 killing process with pid 416330 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@969 -- # kill 416330 00:24:25.875 Received shutdown signal, test time was about 3.000000 seconds 00:24:25.875 00:24:25.875 Latency(us) 00:24:25.875 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:25.875 =================================================================================================================== 00:24:25.875 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:25.875 23:46:10 compress_isal -- common/autotest_common.sh@974 -- # wait 416330 00:24:27.776 23:46:12 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:24:27.776 23:46:12 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:27.776 23:46:12 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=418170 00:24:27.776 23:46:12 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:27.776 
23:46:12 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:24:27.776 23:46:12 compress_isal -- compress/compress.sh@73 -- # waitforlisten 418170 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 418170 ']' 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:27.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:27.776 23:46:12 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:27.776 [2024-07-24 23:46:12.306694] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:24:27.776 [2024-07-24 23:46:12.306739] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid418170 ] 00:24:27.776 [2024-07-24 23:46:12.370364] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:27.776 [2024-07-24 23:46:12.449317] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:27.776 [2024-07-24 23:46:12.449320] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:28.342 23:46:13 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:28.342 23:46:13 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:24:28.342 23:46:13 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:24:28.342 23:46:13 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:28.343 23:46:13 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:31.635 23:46:16 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:31.635 23:46:16 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:31.635 [ 00:24:31.635 { 00:24:31.635 "name": "Nvme0n1", 00:24:31.635 "aliases": [ 00:24:31.635 "6480e5a9-5cae-48f4-847b-8e2b7a1b8f0c" 00:24:31.635 ], 00:24:31.635 "product_name": "NVMe disk", 00:24:31.635 "block_size": 512, 00:24:31.635 "num_blocks": 1953525168, 00:24:31.635 "uuid": "6480e5a9-5cae-48f4-847b-8e2b7a1b8f0c", 00:24:31.635 "assigned_rate_limits": { 00:24:31.635 "rw_ios_per_sec": 0, 00:24:31.635 "rw_mbytes_per_sec": 0, 00:24:31.635 "r_mbytes_per_sec": 0, 00:24:31.635 "w_mbytes_per_sec": 0 00:24:31.635 }, 00:24:31.635 "claimed": false, 00:24:31.635 "zoned": false, 00:24:31.635 "supported_io_types": { 00:24:31.635 "read": true, 00:24:31.635 "write": true, 00:24:31.635 "unmap": true, 00:24:31.635 "flush": true, 00:24:31.635 "reset": true, 00:24:31.635 "nvme_admin": true, 00:24:31.635 "nvme_io": true, 00:24:31.635 "nvme_io_md": false, 00:24:31.635 "write_zeroes": true, 00:24:31.635 "zcopy": false, 00:24:31.635 "get_zone_info": false, 00:24:31.635 "zone_management": false, 00:24:31.635 "zone_append": false, 00:24:31.635 "compare": false, 00:24:31.635 "compare_and_write": false, 00:24:31.635 "abort": true, 00:24:31.635 "seek_hole": false, 00:24:31.635 "seek_data": false, 00:24:31.635 "copy": false, 00:24:31.635 "nvme_iov_md": false 00:24:31.635 }, 00:24:31.635 "driver_specific": { 00:24:31.635 "nvme": [ 00:24:31.635 { 00:24:31.635 "pci_address": "0000:5e:00.0", 00:24:31.635 "trid": { 00:24:31.635 "trtype": "PCIe", 00:24:31.635 "traddr": "0000:5e:00.0" 00:24:31.635 }, 00:24:31.635 "ctrlr_data": { 00:24:31.635 "cntlid": 0, 00:24:31.635 "vendor_id": "0x8086", 00:24:31.635 "model_number": "INTEL SSDPE2KX010T8", 00:24:31.635 "serial_number": "BTLJ807001JM1P0FGN", 00:24:31.635 "firmware_revision": "VDV10170", 00:24:31.635 "oacs": { 00:24:31.635 "security": 1, 00:24:31.635 "format": 1, 00:24:31.635 "firmware": 1, 00:24:31.635 "ns_manage": 1 00:24:31.635 }, 00:24:31.635 "multi_ctrlr": false, 00:24:31.635 "ana_reporting": false 
00:24:31.635 }, 00:24:31.635 "vs": { 00:24:31.635 "nvme_version": "1.2" 00:24:31.636 }, 00:24:31.636 "ns_data": { 00:24:31.636 "id": 1, 00:24:31.636 "can_share": false 00:24:31.636 }, 00:24:31.636 "security": { 00:24:31.636 "opal": true 00:24:31.636 } 00:24:31.636 } 00:24:31.636 ], 00:24:31.636 "mp_policy": "active_passive" 00:24:31.636 } 00:24:31.636 } 00:24:31.636 ] 00:24:31.636 23:46:16 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:31.636 23:46:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:32.570 accff588-2908-4481-bfeb-cef852d0f878 00:24:32.570 23:46:17 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:32.828 aec39a9d-5ce0-4aeb-b151-9843558c0d54 00:24:32.828 23:46:17 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:32.828 23:46:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:33.087 23:46:17 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:33.087 [ 00:24:33.087 { 00:24:33.087 "name": "aec39a9d-5ce0-4aeb-b151-9843558c0d54", 00:24:33.087 "aliases": [ 00:24:33.087 "lvs0/lv0" 00:24:33.087 ], 00:24:33.087 "product_name": "Logical Volume", 00:24:33.087 
"block_size": 512, 00:24:33.087 "num_blocks": 204800, 00:24:33.087 "uuid": "aec39a9d-5ce0-4aeb-b151-9843558c0d54", 00:24:33.087 "assigned_rate_limits": { 00:24:33.087 "rw_ios_per_sec": 0, 00:24:33.087 "rw_mbytes_per_sec": 0, 00:24:33.087 "r_mbytes_per_sec": 0, 00:24:33.087 "w_mbytes_per_sec": 0 00:24:33.087 }, 00:24:33.087 "claimed": false, 00:24:33.087 "zoned": false, 00:24:33.087 "supported_io_types": { 00:24:33.087 "read": true, 00:24:33.087 "write": true, 00:24:33.087 "unmap": true, 00:24:33.087 "flush": false, 00:24:33.087 "reset": true, 00:24:33.087 "nvme_admin": false, 00:24:33.087 "nvme_io": false, 00:24:33.087 "nvme_io_md": false, 00:24:33.087 "write_zeroes": true, 00:24:33.087 "zcopy": false, 00:24:33.087 "get_zone_info": false, 00:24:33.087 "zone_management": false, 00:24:33.087 "zone_append": false, 00:24:33.087 "compare": false, 00:24:33.087 "compare_and_write": false, 00:24:33.087 "abort": false, 00:24:33.087 "seek_hole": true, 00:24:33.087 "seek_data": true, 00:24:33.087 "copy": false, 00:24:33.087 "nvme_iov_md": false 00:24:33.087 }, 00:24:33.087 "driver_specific": { 00:24:33.087 "lvol": { 00:24:33.087 "lvol_store_uuid": "accff588-2908-4481-bfeb-cef852d0f878", 00:24:33.087 "base_bdev": "Nvme0n1", 00:24:33.087 "thin_provision": true, 00:24:33.087 "num_allocated_clusters": 0, 00:24:33.087 "snapshot": false, 00:24:33.087 "clone": false, 00:24:33.087 "esnap_clone": false 00:24:33.087 } 00:24:33.087 } 00:24:33.087 } 00:24:33.087 ] 00:24:33.087 23:46:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:33.087 23:46:18 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:24:33.087 23:46:18 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:24:33.346 [2024-07-24 23:46:18.223963] vbdev_compress.c: 999:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:33.346 COMP_lvs0/lv0 
00:24:33.346 23:46:18 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:33.346 23:46:18 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:33.605 23:46:18 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:33.605 [ 00:24:33.605 { 00:24:33.605 "name": "COMP_lvs0/lv0", 00:24:33.605 "aliases": [ 00:24:33.605 "afc79776-c378-5981-a55c-b74a652f6ed2" 00:24:33.605 ], 00:24:33.605 "product_name": "compress", 00:24:33.605 "block_size": 4096, 00:24:33.605 "num_blocks": 25088, 00:24:33.605 "uuid": "afc79776-c378-5981-a55c-b74a652f6ed2", 00:24:33.605 "assigned_rate_limits": { 00:24:33.605 "rw_ios_per_sec": 0, 00:24:33.605 "rw_mbytes_per_sec": 0, 00:24:33.605 "r_mbytes_per_sec": 0, 00:24:33.605 "w_mbytes_per_sec": 0 00:24:33.605 }, 00:24:33.605 "claimed": false, 00:24:33.605 "zoned": false, 00:24:33.605 "supported_io_types": { 00:24:33.605 "read": true, 00:24:33.605 "write": true, 00:24:33.605 "unmap": false, 00:24:33.605 "flush": false, 00:24:33.605 "reset": false, 00:24:33.605 "nvme_admin": false, 00:24:33.605 "nvme_io": false, 00:24:33.605 "nvme_io_md": false, 00:24:33.605 "write_zeroes": true, 00:24:33.605 "zcopy": false, 00:24:33.605 "get_zone_info": false, 00:24:33.605 "zone_management": false, 00:24:33.605 "zone_append": false, 00:24:33.605 "compare": false, 00:24:33.605 "compare_and_write": 
false, 00:24:33.605 "abort": false, 00:24:33.605 "seek_hole": false, 00:24:33.605 "seek_data": false, 00:24:33.605 "copy": false, 00:24:33.605 "nvme_iov_md": false 00:24:33.605 }, 00:24:33.605 "driver_specific": { 00:24:33.605 "compress": { 00:24:33.605 "name": "COMP_lvs0/lv0", 00:24:33.605 "base_bdev_name": "aec39a9d-5ce0-4aeb-b151-9843558c0d54", 00:24:33.605 "pm_path": "/tmp/pmem/49baa9a5-68f5-41b7-858e-9118d1e8fc1c" 00:24:33.605 } 00:24:33.605 } 00:24:33.605 } 00:24:33.605 ] 00:24:33.864 23:46:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:33.864 23:46:18 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:33.864 Running I/O for 3 seconds... 00:24:37.148 00:24:37.148 Latency(us) 00:24:37.148 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:37.148 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:24:37.148 Verification LBA range: start 0x0 length 0x3100 00:24:37.148 COMP_lvs0/lv0 : 3.01 3353.79 13.10 0.00 0.00 9497.45 57.54 14854.83 00:24:37.148 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:24:37.148 Verification LBA range: start 0x3100 length 0x3100 00:24:37.148 COMP_lvs0/lv0 : 3.01 3373.17 13.18 0.00 0.00 9442.69 55.83 15291.73 00:24:37.148 =================================================================================================================== 00:24:37.148 Total : 6726.95 26.28 0.00 0.00 9469.98 55.83 15291.73 00:24:37.148 0 00:24:37.148 23:46:21 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:24:37.148 23:46:21 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:37.148 23:46:21 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:24:37.148 23:46:22 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:24:37.148 23:46:22 compress_isal -- compress/compress.sh@78 -- # killprocess 418170 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 418170 ']' 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@954 -- # kill -0 418170 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@955 -- # uname 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 418170 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 418170' 00:24:37.148 killing process with pid 418170 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@969 -- # kill 418170 00:24:37.148 Received shutdown signal, test time was about 3.000000 seconds 00:24:37.148 00:24:37.148 Latency(us) 00:24:37.148 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:37.148 =================================================================================================================== 00:24:37.148 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:37.148 23:46:22 compress_isal -- common/autotest_common.sh@974 -- # wait 418170 00:24:39.050 23:46:23 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:24:39.050 23:46:23 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:24:39.050 23:46:23 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=420009 00:24:39.050 23:46:23 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:39.050 23:46:23 compress_isal -- 
compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:24:39.050 23:46:23 compress_isal -- compress/compress.sh@57 -- # waitforlisten 420009 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 420009 ']' 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:39.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:39.050 23:46:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:39.050 [2024-07-24 23:46:23.608597] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:24:39.050 [2024-07-24 23:46:23.608641] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid420009 ] 00:24:39.050 [2024-07-24 23:46:23.671767] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:39.050 [2024-07-24 23:46:23.742046] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:39.050 [2024-07-24 23:46:23.742145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:39.050 [2024-07-24 23:46:23.742145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:39.616 23:46:24 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:39.616 23:46:24 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:24:39.616 23:46:24 compress_isal -- compress/compress.sh@58 -- # create_vols 00:24:39.616 23:46:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:39.616 23:46:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:42.898 23:46:27 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:42.898 23:46:27 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:42.898 23:46:27 compress_isal -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:42.898 [ 00:24:42.898 { 00:24:42.898 "name": "Nvme0n1", 00:24:42.898 "aliases": [ 00:24:42.898 "2421f215-d2ce-43d5-9af7-6feddbf1cfe6" 00:24:42.898 ], 00:24:42.898 "product_name": "NVMe disk", 00:24:42.898 "block_size": 512, 00:24:42.898 "num_blocks": 1953525168, 00:24:42.898 "uuid": "2421f215-d2ce-43d5-9af7-6feddbf1cfe6", 00:24:42.898 "assigned_rate_limits": { 00:24:42.898 "rw_ios_per_sec": 0, 00:24:42.898 "rw_mbytes_per_sec": 0, 00:24:42.898 "r_mbytes_per_sec": 0, 00:24:42.898 "w_mbytes_per_sec": 0 00:24:42.898 }, 00:24:42.898 "claimed": false, 00:24:42.898 "zoned": false, 00:24:42.898 "supported_io_types": { 00:24:42.898 "read": true, 00:24:42.898 "write": true, 00:24:42.898 "unmap": true, 00:24:42.898 "flush": true, 00:24:42.898 "reset": true, 00:24:42.898 "nvme_admin": true, 00:24:42.898 "nvme_io": true, 00:24:42.898 "nvme_io_md": false, 00:24:42.898 "write_zeroes": true, 00:24:42.898 "zcopy": false, 00:24:42.898 "get_zone_info": false, 00:24:42.898 "zone_management": false, 00:24:42.898 "zone_append": false, 00:24:42.898 "compare": false, 00:24:42.898 "compare_and_write": false, 00:24:42.898 "abort": true, 00:24:42.898 "seek_hole": false, 00:24:42.898 "seek_data": false, 00:24:42.898 "copy": false, 00:24:42.898 "nvme_iov_md": false 00:24:42.898 }, 00:24:42.898 "driver_specific": { 00:24:42.898 "nvme": [ 00:24:42.898 { 00:24:42.898 "pci_address": "0000:5e:00.0", 00:24:42.898 "trid": { 00:24:42.898 "trtype": "PCIe", 00:24:42.898 "traddr": "0000:5e:00.0" 00:24:42.898 }, 00:24:42.898 "ctrlr_data": { 00:24:42.898 "cntlid": 0, 00:24:42.898 "vendor_id": "0x8086", 00:24:42.898 "model_number": "INTEL SSDPE2KX010T8", 00:24:42.899 "serial_number": "BTLJ807001JM1P0FGN", 00:24:42.899 "firmware_revision": "VDV10170", 00:24:42.899 "oacs": { 00:24:42.899 "security": 1, 00:24:42.899 "format": 1, 00:24:42.899 "firmware": 1, 00:24:42.899 
"ns_manage": 1 00:24:42.899 }, 00:24:42.899 "multi_ctrlr": false, 00:24:42.899 "ana_reporting": false 00:24:42.899 }, 00:24:42.899 "vs": { 00:24:42.899 "nvme_version": "1.2" 00:24:42.899 }, 00:24:42.899 "ns_data": { 00:24:42.899 "id": 1, 00:24:42.899 "can_share": false 00:24:42.899 }, 00:24:42.899 "security": { 00:24:42.899 "opal": true 00:24:42.899 } 00:24:42.899 } 00:24:42.899 ], 00:24:42.899 "mp_policy": "active_passive" 00:24:42.899 } 00:24:42.899 } 00:24:42.899 ] 00:24:42.899 23:46:27 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:42.899 23:46:27 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:43.833 1853fa34-a269-4df4-9e9f-dcfe1bbeb19f 00:24:43.834 23:46:28 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:44.092 30305c18-aca1-45e8-adcc-d7b0070ce89a 00:24:44.092 23:46:28 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:44.092 23:46:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:44.351 23:46:29 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:44.351 [ 00:24:44.351 { 00:24:44.351 "name": "30305c18-aca1-45e8-adcc-d7b0070ce89a", 00:24:44.351 "aliases": 
[ 00:24:44.351 "lvs0/lv0" 00:24:44.351 ], 00:24:44.351 "product_name": "Logical Volume", 00:24:44.351 "block_size": 512, 00:24:44.351 "num_blocks": 204800, 00:24:44.351 "uuid": "30305c18-aca1-45e8-adcc-d7b0070ce89a", 00:24:44.351 "assigned_rate_limits": { 00:24:44.351 "rw_ios_per_sec": 0, 00:24:44.351 "rw_mbytes_per_sec": 0, 00:24:44.351 "r_mbytes_per_sec": 0, 00:24:44.351 "w_mbytes_per_sec": 0 00:24:44.351 }, 00:24:44.351 "claimed": false, 00:24:44.351 "zoned": false, 00:24:44.351 "supported_io_types": { 00:24:44.351 "read": true, 00:24:44.351 "write": true, 00:24:44.351 "unmap": true, 00:24:44.351 "flush": false, 00:24:44.351 "reset": true, 00:24:44.351 "nvme_admin": false, 00:24:44.351 "nvme_io": false, 00:24:44.351 "nvme_io_md": false, 00:24:44.351 "write_zeroes": true, 00:24:44.351 "zcopy": false, 00:24:44.351 "get_zone_info": false, 00:24:44.351 "zone_management": false, 00:24:44.351 "zone_append": false, 00:24:44.351 "compare": false, 00:24:44.351 "compare_and_write": false, 00:24:44.351 "abort": false, 00:24:44.351 "seek_hole": true, 00:24:44.351 "seek_data": true, 00:24:44.351 "copy": false, 00:24:44.351 "nvme_iov_md": false 00:24:44.351 }, 00:24:44.351 "driver_specific": { 00:24:44.351 "lvol": { 00:24:44.351 "lvol_store_uuid": "1853fa34-a269-4df4-9e9f-dcfe1bbeb19f", 00:24:44.351 "base_bdev": "Nvme0n1", 00:24:44.351 "thin_provision": true, 00:24:44.351 "num_allocated_clusters": 0, 00:24:44.351 "snapshot": false, 00:24:44.351 "clone": false, 00:24:44.351 "esnap_clone": false 00:24:44.351 } 00:24:44.351 } 00:24:44.351 } 00:24:44.351 ] 00:24:44.351 23:46:29 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:44.351 23:46:29 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:44.351 23:46:29 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:44.609 [2024-07-24 23:46:29.478031] vbdev_compress.c: 999:vbdev_compress_claim: 
*NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:44.609 COMP_lvs0/lv0 00:24:44.609 23:46:29 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@901 -- # local i 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:24:44.609 23:46:29 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:44.867 23:46:29 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:44.867 [ 00:24:44.867 { 00:24:44.867 "name": "COMP_lvs0/lv0", 00:24:44.867 "aliases": [ 00:24:44.867 "d17449bc-a6b2-5350-bc7f-991d61518258" 00:24:44.867 ], 00:24:44.867 "product_name": "compress", 00:24:44.867 "block_size": 512, 00:24:44.867 "num_blocks": 200704, 00:24:44.867 "uuid": "d17449bc-a6b2-5350-bc7f-991d61518258", 00:24:44.867 "assigned_rate_limits": { 00:24:44.867 "rw_ios_per_sec": 0, 00:24:44.867 "rw_mbytes_per_sec": 0, 00:24:44.867 "r_mbytes_per_sec": 0, 00:24:44.867 "w_mbytes_per_sec": 0 00:24:44.867 }, 00:24:44.867 "claimed": false, 00:24:44.867 "zoned": false, 00:24:44.867 "supported_io_types": { 00:24:44.867 "read": true, 00:24:44.867 "write": true, 00:24:44.867 "unmap": false, 00:24:44.867 "flush": false, 00:24:44.867 "reset": false, 00:24:44.867 "nvme_admin": false, 00:24:44.867 "nvme_io": false, 00:24:44.867 "nvme_io_md": false, 00:24:44.867 "write_zeroes": true, 00:24:44.867 "zcopy": false, 00:24:44.867 "get_zone_info": false, 00:24:44.867 "zone_management": false, 
00:24:44.867 "zone_append": false, 00:24:44.867 "compare": false, 00:24:44.867 "compare_and_write": false, 00:24:44.867 "abort": false, 00:24:44.867 "seek_hole": false, 00:24:44.867 "seek_data": false, 00:24:44.867 "copy": false, 00:24:44.867 "nvme_iov_md": false 00:24:44.867 }, 00:24:44.867 "driver_specific": { 00:24:44.867 "compress": { 00:24:44.867 "name": "COMP_lvs0/lv0", 00:24:44.867 "base_bdev_name": "30305c18-aca1-45e8-adcc-d7b0070ce89a", 00:24:44.867 "pm_path": "/tmp/pmem/5b87bc67-2e65-4f41-9b50-1488b71ebf2e" 00:24:44.867 } 00:24:44.867 } 00:24:44.867 } 00:24:44.867 ] 00:24:44.867 23:46:29 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:24:44.867 23:46:29 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:45.126 I/O targets: 00:24:45.126 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:24:45.126 00:24:45.126 00:24:45.126 CUnit - A unit testing framework for C - Version 2.1-3 00:24:45.126 http://cunit.sourceforge.net/ 00:24:45.126 00:24:45.126 00:24:45.126 Suite: bdevio tests on: COMP_lvs0/lv0 00:24:45.126 Test: blockdev write read block ...passed 00:24:45.126 Test: blockdev write zeroes read block ...passed 00:24:45.126 Test: blockdev write zeroes read no split ...passed 00:24:45.126 Test: blockdev write zeroes read split ...passed 00:24:45.126 Test: blockdev write zeroes read split partial ...passed 00:24:45.126 Test: blockdev reset ...[2024-07-24 23:46:29.960298] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:24:45.126 passed 00:24:45.126 Test: blockdev write read 8 blocks ...passed 00:24:45.126 Test: blockdev write read size > 128k ...passed 00:24:45.126 Test: blockdev write read invalid size ...passed 00:24:45.126 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:45.126 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:45.126 Test: blockdev write read max 
offset ...passed 00:24:45.126 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:45.126 Test: blockdev writev readv 8 blocks ...passed 00:24:45.126 Test: blockdev writev readv 30 x 1block ...passed 00:24:45.126 Test: blockdev writev readv block ...passed 00:24:45.126 Test: blockdev writev readv size > 128k ...passed 00:24:45.126 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:45.126 Test: blockdev comparev and writev ...passed 00:24:45.126 Test: blockdev nvme passthru rw ...passed 00:24:45.126 Test: blockdev nvme passthru vendor specific ...passed 00:24:45.126 Test: blockdev nvme admin passthru ...passed 00:24:45.126 Test: blockdev copy ...passed 00:24:45.126 00:24:45.126 Run Summary: Type Total Ran Passed Failed Inactive 00:24:45.126 suites 1 1 n/a 0 0 00:24:45.126 tests 23 23 23 0 0 00:24:45.126 asserts 130 130 130 0 n/a 00:24:45.126 00:24:45.126 Elapsed time = 0.173 seconds 00:24:45.126 0 00:24:45.126 23:46:30 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:24:45.126 23:46:30 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:24:45.384 23:46:30 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:24:45.384 23:46:30 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:24:45.384 23:46:30 compress_isal -- compress/compress.sh@62 -- # killprocess 420009 00:24:45.384 23:46:30 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 420009 ']' 00:24:45.384 23:46:30 compress_isal -- common/autotest_common.sh@954 -- # kill -0 420009 00:24:45.384 23:46:30 compress_isal -- common/autotest_common.sh@955 -- # uname 00:24:45.384 23:46:30 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:45.384 23:46:30 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o 
comm= 420009 00:24:45.642 23:46:30 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:45.642 23:46:30 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:45.642 23:46:30 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 420009' 00:24:45.642 killing process with pid 420009 00:24:45.642 23:46:30 compress_isal -- common/autotest_common.sh@969 -- # kill 420009 00:24:45.642 23:46:30 compress_isal -- common/autotest_common.sh@974 -- # wait 420009 00:24:47.048 23:46:31 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:24:47.048 23:46:31 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:24:47.048 00:24:47.048 real 0m42.127s 00:24:47.048 user 1m35.157s 00:24:47.048 sys 0m2.791s 00:24:47.048 23:46:31 compress_isal -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:47.048 23:46:31 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:24:47.048 ************************************ 00:24:47.048 END TEST compress_isal 00:24:47.048 ************************************ 00:24:47.048 23:46:31 -- spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:24:47.048 23:46:31 -- spdk/autotest.sh@360 -- # '[' 1 -eq 1 ']' 00:24:47.048 23:46:31 -- spdk/autotest.sh@361 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:24:47.048 23:46:31 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:24:47.048 23:46:31 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:47.048 23:46:31 -- common/autotest_common.sh@10 -- # set +x 00:24:47.048 ************************************ 00:24:47.048 START TEST blockdev_crypto_aesni 00:24:47.048 ************************************ 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:24:47.048 * Looking for test storage... 
00:24:47.048 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:24:47.048 23:46:31 
blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=421443 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 421443 00:24:47.048 23:46:31 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # '[' -z 421443 ']' 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:47.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:47.048 23:46:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:47.312 [2024-07-24 23:46:32.042056] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:24:47.312 [2024-07-24 23:46:32.042103] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid421443 ] 00:24:47.312 [2024-07-24 23:46:32.107165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:47.312 [2024-07-24 23:46:32.177912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:47.878 23:46:32 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:47.878 23:46:32 blockdev_crypto_aesni -- common/autotest_common.sh@864 -- # return 0 00:24:47.878 23:46:32 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:24:47.878 23:46:32 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:24:47.878 23:46:32 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:24:47.878 23:46:32 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:47.878 23:46:32 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:47.878 [2024-07-24 23:46:32.847895] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:24:47.878 [2024-07-24 23:46:32.855923] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:47.878 [2024-07-24 23:46:32.863941] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:48.141 [2024-07-24 23:46:32.926971] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:24:50.676 true 00:24:50.676 true 00:24:50.676 true 00:24:50.676 true 00:24:50.676 Malloc0 00:24:50.676 Malloc1 00:24:50.676 Malloc2 00:24:50.677 Malloc3 00:24:50.677 [2024-07-24 23:46:35.206862] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:24:50.677 crypto_ram 00:24:50.677 [2024-07-24 23:46:35.214878] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:24:50.677 crypto_ram2 00:24:50.677 [2024-07-24 23:46:35.222899] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:24:50.677 crypto_ram3 00:24:50.677 [2024-07-24 23:46:35.230922] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:24:50.677 crypto_ram4 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:50.677 23:46:35 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "99501f13-3c00-5348-b855-2574392cef43"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "99501f13-3c00-5348-b855-2574392cef43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a733981-b4b4-527c-9d1f-ee887708367d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6a733981-b4b4-527c-9d1f-ee887708367d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6cc1a231-600f-5cbd-9238-c891439c5e0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6cc1a231-600f-5cbd-9238-c891439c5e0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "3542f73c-3924-5ae1-aa23-b97c82b06021"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3542f73c-3924-5ae1-aa23-b97c82b06021",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # 
bdev_list=("${bdevs_name[@]}") 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:24:50.677 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 421443 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # '[' -z 421443 ']' 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # kill -0 421443 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # uname 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 421443 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # echo 'killing process with pid 421443' 00:24:50.677 killing process with pid 421443 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@969 -- # kill 421443 00:24:50.677 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@974 -- # wait 421443 00:24:50.936 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:24:50.936 23:46:35 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:24:50.936 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:50.936 23:46:35 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:50.936 23:46:35 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:50.936 ************************************ 00:24:50.936 START TEST bdev_hello_world 00:24:50.936 ************************************ 00:24:50.936 23:46:35 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:24:51.195 [2024-07-24 23:46:35.966508] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:24:51.195 [2024-07-24 23:46:35.966547] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid422141 ] 00:24:51.195 [2024-07-24 23:46:36.028787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:51.195 [2024-07-24 23:46:36.100419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:51.195 [2024-07-24 23:46:36.121269] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:24:51.195 [2024-07-24 23:46:36.129292] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:51.195 [2024-07-24 23:46:36.137310] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:51.454 [2024-07-24 23:46:36.232758] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:24:53.987 [2024-07-24 23:46:38.376194] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:24:53.987 [2024-07-24 23:46:38.376244] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:53.987 [2024-07-24 23:46:38.376268] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:53.987 [2024-07-24 23:46:38.384213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:24:53.987 [2024-07-24 23:46:38.384225] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:24:53.987 [2024-07-24 23:46:38.384231] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:53.987 [2024-07-24 23:46:38.392232] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:24:53.987 [2024-07-24 23:46:38.392248] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:24:53.987 [2024-07-24 23:46:38.392254] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:53.987 [2024-07-24 23:46:38.400252] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:24:53.987 [2024-07-24 23:46:38.400262] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:24:53.987 [2024-07-24 23:46:38.400267] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:53.987 [2024-07-24 23:46:38.467843] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:24:53.987 [2024-07-24 23:46:38.467880] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:24:53.987 [2024-07-24 23:46:38.467889] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:24:53.987 [2024-07-24 23:46:38.468761] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:24:53.987 [2024-07-24 23:46:38.468818] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:24:53.987 [2024-07-24 23:46:38.468827] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:24:53.987 [2024-07-24 23:46:38.468855] hello_bdev.c: 
65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:24:53.987 00:24:53.987 [2024-07-24 23:46:38.468866] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:24:53.987 00:24:53.987 real 0m2.853s 00:24:53.987 user 0m2.573s 00:24:53.987 sys 0m0.250s 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:24:53.987 ************************************ 00:24:53.987 END TEST bdev_hello_world 00:24:53.987 ************************************ 00:24:53.987 23:46:38 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:24:53.987 23:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:24:53.987 23:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:53.987 23:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:53.987 ************************************ 00:24:53.987 START TEST bdev_bounds 00:24:53.987 ************************************ 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=422613 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 422613' 00:24:53.987 Process bdevio pid: 422613 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- 
bdev/blockdev.sh@292 -- # waitforlisten 422613 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 422613 ']' 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:53.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:53.987 23:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:24:53.987 [2024-07-24 23:46:38.883526] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:24:53.987 [2024-07-24 23:46:38.883560] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid422613 ] 00:24:53.987 [2024-07-24 23:46:38.947792] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:54.246 [2024-07-24 23:46:39.028129] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:24:54.246 [2024-07-24 23:46:39.028227] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:54.246 [2024-07-24 23:46:39.028227] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:24:54.246 [2024-07-24 23:46:39.049174] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:24:54.246 [2024-07-24 23:46:39.057198] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:54.246 [2024-07-24 23:46:39.065218] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:54.246 [2024-07-24 23:46:39.163250] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:24:56.779 [2024-07-24 23:46:41.312528] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:24:56.779 [2024-07-24 23:46:41.312584] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:24:56.779 [2024-07-24 23:46:41.312593] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:56.779 [2024-07-24 23:46:41.320547] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:24:56.779 [2024-07-24 23:46:41.320560] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:24:56.779 [2024-07-24 
23:46:41.320565] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:56.779 [2024-07-24 23:46:41.328568] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:24:56.779 [2024-07-24 23:46:41.328579] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:24:56.779 [2024-07-24 23:46:41.328584] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:56.779 [2024-07-24 23:46:41.336589] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:24:56.779 [2024-07-24 23:46:41.336599] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:24:56.779 [2024-07-24 23:46:41.336604] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:24:56.779 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:56.779 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:24:56.779 23:46:41 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:24:56.779 I/O targets: 00:24:56.779 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:24:56.779 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:24:56.780 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:24:56.780 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:24:56.780 00:24:56.780 00:24:56.780 CUnit - A unit testing framework for C - Version 2.1-3 00:24:56.780 http://cunit.sourceforge.net/ 00:24:56.780 00:24:56.780 00:24:56.780 Suite: bdevio tests on: crypto_ram4 00:24:56.780 Test: blockdev write read block ...passed 00:24:56.780 Test: blockdev write zeroes read block ...passed 00:24:56.780 Test: blockdev write zeroes read no split ...passed 00:24:56.780 Test: blockdev 
write zeroes read split ...passed 00:24:56.780 Test: blockdev write zeroes read split partial ...passed 00:24:56.780 Test: blockdev reset ...passed 00:24:56.780 Test: blockdev write read 8 blocks ...passed 00:24:56.780 Test: blockdev write read size > 128k ...passed 00:24:56.780 Test: blockdev write read invalid size ...passed 00:24:56.780 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:56.780 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:56.780 Test: blockdev write read max offset ...passed 00:24:56.780 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:56.780 Test: blockdev writev readv 8 blocks ...passed 00:24:56.780 Test: blockdev writev readv 30 x 1block ...passed 00:24:56.780 Test: blockdev writev readv block ...passed 00:24:56.780 Test: blockdev writev readv size > 128k ...passed 00:24:56.780 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:56.780 Test: blockdev comparev and writev ...passed 00:24:56.780 Test: blockdev nvme passthru rw ...passed 00:24:56.780 Test: blockdev nvme passthru vendor specific ...passed 00:24:56.780 Test: blockdev nvme admin passthru ...passed 00:24:56.780 Test: blockdev copy ...passed 00:24:56.780 Suite: bdevio tests on: crypto_ram3 00:24:56.780 Test: blockdev write read block ...passed 00:24:56.780 Test: blockdev write zeroes read block ...passed 00:24:56.780 Test: blockdev write zeroes read no split ...passed 00:24:56.780 Test: blockdev write zeroes read split ...passed 00:24:56.780 Test: blockdev write zeroes read split partial ...passed 00:24:56.780 Test: blockdev reset ...passed 00:24:56.780 Test: blockdev write read 8 blocks ...passed 00:24:56.780 Test: blockdev write read size > 128k ...passed 00:24:56.780 Test: blockdev write read invalid size ...passed 00:24:56.780 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:56.780 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:24:56.780 Test: blockdev write read max offset ...passed 00:24:56.780 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:56.780 Test: blockdev writev readv 8 blocks ...passed 00:24:56.780 Test: blockdev writev readv 30 x 1block ...passed 00:24:56.780 Test: blockdev writev readv block ...passed 00:24:56.780 Test: blockdev writev readv size > 128k ...passed 00:24:56.780 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:56.780 Test: blockdev comparev and writev ...passed 00:24:56.780 Test: blockdev nvme passthru rw ...passed 00:24:56.780 Test: blockdev nvme passthru vendor specific ...passed 00:24:56.780 Test: blockdev nvme admin passthru ...passed 00:24:56.780 Test: blockdev copy ...passed 00:24:56.780 Suite: bdevio tests on: crypto_ram2 00:24:56.780 Test: blockdev write read block ...passed 00:24:56.780 Test: blockdev write zeroes read block ...passed 00:24:56.780 Test: blockdev write zeroes read no split ...passed 00:24:56.780 Test: blockdev write zeroes read split ...passed 00:24:56.780 Test: blockdev write zeroes read split partial ...passed 00:24:56.780 Test: blockdev reset ...passed 00:24:56.780 Test: blockdev write read 8 blocks ...passed 00:24:56.780 Test: blockdev write read size > 128k ...passed 00:24:56.780 Test: blockdev write read invalid size ...passed 00:24:56.780 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:56.780 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:56.780 Test: blockdev write read max offset ...passed 00:24:56.780 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:56.780 Test: blockdev writev readv 8 blocks ...passed 00:24:56.780 Test: blockdev writev readv 30 x 1block ...passed 00:24:56.780 Test: blockdev writev readv block ...passed 00:24:56.780 Test: blockdev writev readv size > 128k ...passed 00:24:56.780 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:56.780 Test: 
blockdev comparev and writev ...passed 00:24:56.780 Test: blockdev nvme passthru rw ...passed 00:24:56.780 Test: blockdev nvme passthru vendor specific ...passed 00:24:56.780 Test: blockdev nvme admin passthru ...passed 00:24:56.780 Test: blockdev copy ...passed 00:24:56.780 Suite: bdevio tests on: crypto_ram 00:24:56.780 Test: blockdev write read block ...passed 00:24:56.780 Test: blockdev write zeroes read block ...passed 00:24:56.780 Test: blockdev write zeroes read no split ...passed 00:24:56.780 Test: blockdev write zeroes read split ...passed 00:24:56.780 Test: blockdev write zeroes read split partial ...passed 00:24:56.780 Test: blockdev reset ...passed 00:24:56.780 Test: blockdev write read 8 blocks ...passed 00:24:56.780 Test: blockdev write read size > 128k ...passed 00:24:56.780 Test: blockdev write read invalid size ...passed 00:24:56.780 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:24:56.780 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:24:56.780 Test: blockdev write read max offset ...passed 00:24:56.780 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:24:56.780 Test: blockdev writev readv 8 blocks ...passed 00:24:56.780 Test: blockdev writev readv 30 x 1block ...passed 00:24:56.780 Test: blockdev writev readv block ...passed 00:24:56.780 Test: blockdev writev readv size > 128k ...passed 00:24:56.780 Test: blockdev writev readv size > 128k in two iovs ...passed 00:24:56.780 Test: blockdev comparev and writev ...passed 00:24:56.780 Test: blockdev nvme passthru rw ...passed 00:24:56.780 Test: blockdev nvme passthru vendor specific ...passed 00:24:56.780 Test: blockdev nvme admin passthru ...passed 00:24:56.780 Test: blockdev copy ...passed 00:24:56.780 00:24:56.780 Run Summary: Type Total Ran Passed Failed Inactive 00:24:56.780 suites 4 4 n/a 0 0 00:24:56.780 tests 92 92 92 0 0 00:24:56.780 asserts 520 520 520 0 n/a 00:24:56.780 00:24:56.780 Elapsed time = 0.524 
seconds 00:24:56.780 0 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 422613 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 422613 ']' 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 422613 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 422613 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 422613' 00:24:57.065 killing process with pid 422613 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@969 -- # kill 422613 00:24:57.065 23:46:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@974 -- # wait 422613 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:24:57.324 00:24:57.324 real 0m3.294s 00:24:57.324 user 0m9.324s 00:24:57.324 sys 0m0.376s 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:24:57.324 ************************************ 00:24:57.324 END TEST bdev_bounds 00:24:57.324 ************************************ 00:24:57.324 23:46:42 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:24:57.324 23:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:24:57.324 23:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:57.324 23:46:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:24:57.324 ************************************ 00:24:57.324 START TEST bdev_nbd 00:24:57.324 ************************************ 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:24:57.324 
23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:24:57.324 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=423099 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 423099 /var/tmp/spdk-nbd.sock 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 423099 ']' 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:24:57.325 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:57.325 23:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:24:57.325 [2024-07-24 23:46:42.257716] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:24:57.325 [2024-07-24 23:46:42.257756] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:24:57.584 [2024-07-24 23:46:42.324857] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:57.584 [2024-07-24 23:46:42.396342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:57.584 [2024-07-24 23:46:42.417258] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:24:57.584 [2024-07-24 23:46:42.425279] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:24:57.584 [2024-07-24 23:46:42.433297] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:24:57.584 [2024-07-24 23:46:42.526399] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:00.117 [2024-07-24 23:46:44.668353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:00.117 [2024-07-24 23:46:44.668408] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:00.117 [2024-07-24 23:46:44.668433] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:00.117 [2024-07-24 23:46:44.676374] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:00.117 [2024-07-24 23:46:44.676386] bdev.c:8190:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:00.117 [2024-07-24 23:46:44.676392] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:00.117 [2024-07-24 23:46:44.684391] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:00.117 [2024-07-24 23:46:44.684401] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:00.117 [2024-07-24 23:46:44.684406] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:00.117 [2024-07-24 23:46:44.692411] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:00.117 [2024-07-24 23:46:44.692421] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:00.117 [2024-07-24 23:46:44.692427] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:25:00.117 23:46:44 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:00.117 23:46:44 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:00.117 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.117 1+0 records in 00:25:00.117 1+0 records out 00:25:00.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240853 s, 17.0 MB/s 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:00.118 23:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:00.376 23:46:45 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.376 1+0 records in 00:25:00.376 1+0 records out 00:25:00.376 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196336 s, 20.9 MB/s 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:25:00.376 
23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.376 1+0 records in 00:25:00.376 1+0 records out 00:25:00.376 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237463 s, 17.2 MB/s 00:25:00.376 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # 
'[' 4096 '!=' 0 ']' 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:00.635 1+0 records in 00:25:00.635 1+0 records out 00:25:00.635 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244362 s, 16.8 MB/s 
00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:25:00.635 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd0", 00:25:00.894 "bdev_name": "crypto_ram" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd1", 00:25:00.894 "bdev_name": "crypto_ram2" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd2", 00:25:00.894 "bdev_name": "crypto_ram3" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd3", 00:25:00.894 "bdev_name": "crypto_ram4" 00:25:00.894 } 00:25:00.894 ]' 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd0", 00:25:00.894 "bdev_name": "crypto_ram" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd1", 
00:25:00.894 "bdev_name": "crypto_ram2" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd2", 00:25:00.894 "bdev_name": "crypto_ram3" 00:25:00.894 }, 00:25:00.894 { 00:25:00.894 "nbd_device": "/dev/nbd3", 00:25:00.894 "bdev_name": "crypto_ram4" 00:25:00.894 } 00:25:00.894 ]' 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:00.894 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@41 -- # break 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:01.152 23:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:01.411 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:01.412 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:25:01.670 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:01.671 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:01.930 23:46:46 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:01.930 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:25:02.189 /dev/nbd0 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:02.189 23:46:46 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:02.189 1+0 records in 00:25:02.189 1+0 records out 00:25:02.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241079 s, 17.0 MB/s 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:02.189 23:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:25:02.189 /dev/nbd1 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 
00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:02.189 1+0 records in 00:25:02.189 1+0 records out 00:25:02.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023998 s, 17.1 MB/s 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:02.189 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:25:02.447 /dev/nbd10 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:02.447 1+0 records in 00:25:02.447 1+0 records out 00:25:02.447 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221773 s, 18.5 MB/s 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.447 23:46:47 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:02.447 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:25:02.706 /dev/nbd11 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:02.706 1+0 records in 00:25:02.706 1+0 records out 00:25:02.706 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257854 s, 15.9 
MB/s 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:02.706 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd0", 00:25:02.964 "bdev_name": "crypto_ram" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd1", 00:25:02.964 "bdev_name": "crypto_ram2" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd10", 00:25:02.964 "bdev_name": "crypto_ram3" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd11", 00:25:02.964 "bdev_name": "crypto_ram4" 00:25:02.964 } 00:25:02.964 ]' 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd0", 00:25:02.964 "bdev_name": 
"crypto_ram" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd1", 00:25:02.964 "bdev_name": "crypto_ram2" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd10", 00:25:02.964 "bdev_name": "crypto_ram3" 00:25:02.964 }, 00:25:02.964 { 00:25:02.964 "nbd_device": "/dev/nbd11", 00:25:02.964 "bdev_name": "crypto_ram4" 00:25:02.964 } 00:25:02.964 ]' 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:25:02.964 /dev/nbd1 00:25:02.964 /dev/nbd10 00:25:02.964 /dev/nbd11' 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:25:02.964 /dev/nbd1 00:25:02.964 /dev/nbd10 00:25:02.964 /dev/nbd11' 00:25:02.964 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:02.965 
23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:25:02.965 256+0 records in 00:25:02.965 256+0 records out 00:25:02.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00335592 s, 312 MB/s 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:25:02.965 256+0 records in 00:25:02.965 256+0 records out 00:25:02.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0272205 s, 38.5 MB/s 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:25:02.965 256+0 records in 00:25:02.965 256+0 records out 00:25:02.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0300468 s, 34.9 MB/s 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:25:02.965 256+0 records in 00:25:02.965 256+0 records out 00:25:02.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0263624 s, 39.8 MB/s 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:25:02.965 256+0 records in 00:25:02.965 256+0 records out 00:25:02.965 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0246634 s, 42.5 MB/s 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 
00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:02.965 23:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:03.224 
23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:03.224 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:03.481 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:03.482 
23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:03.482 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:03.740 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 
-- # nbd_disks_json='[]' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:25:03.998 23:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:25:04.256 malloc_lvol_verify 00:25:04.256 23:46:49 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:25:04.256 78e7d57e-579a-4f61-8cfb-ec03dc3dbdf8 00:25:04.256 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:25:04.513 7b76113f-48b6-49ee-a6f2-fc1c2be5c6d3 00:25:04.514 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:25:04.772 /dev/nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:25:04.772 mke2fs 1.46.5 (30-Dec-2021) 00:25:04.772 Discarding device blocks: 0/4096 done 00:25:04.772 Creating filesystem with 4096 1k blocks and 1024 inodes 00:25:04.772 00:25:04.772 Allocating group tables: 0/1 done 00:25:04.772 Writing inode tables: 0/1 done 00:25:04.772 Creating journal (1024 blocks): done 00:25:04.772 Writing superblocks and filesystem accounting information: 0/1 done 00:25:04.772 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 423099 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 423099 ']' 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 423099 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:25:04.772 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:05.031 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 423099 00:25:05.031 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:05.031 23:46:49 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:05.031 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 423099' 00:25:05.031 killing process with pid 423099 00:25:05.031 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@969 -- # kill 423099 00:25:05.031 23:46:49 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@974 -- # wait 423099 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:25:05.290 00:25:05.290 real 0m7.913s 00:25:05.290 user 0m10.624s 00:25:05.290 sys 0m2.332s 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:05.290 ************************************ 00:25:05.290 END TEST bdev_nbd 00:25:05.290 ************************************ 00:25:05.290 23:46:50 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:25:05.290 23:46:50 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:25:05.290 23:46:50 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:05.290 23:46:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:05.290 ************************************ 00:25:05.290 START TEST bdev_fio 00:25:05.290 ************************************ 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@330 -- # local env_context 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:05.290 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:25:05.290 23:46:50 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:05.290 ************************************ 00:25:05.290 START TEST bdev_fio_rw_verify 00:25:05.290 ************************************ 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:05.290 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:05.291 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:05.570 23:46:50 
blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:05.570 23:46:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:05.835 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:05.835 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:05.835 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:05.835 
job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:05.835 fio-3.35 00:25:05.835 Starting 4 threads 00:25:20.742 00:25:20.742 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=425241: Wed Jul 24 23:47:03 2024 00:25:20.742 read: IOPS=24.5k, BW=95.8MiB/s (100MB/s)(958MiB/10001msec) 00:25:20.742 slat (usec): min=11, max=326, avg=57.22, stdev=27.02 00:25:20.742 clat (usec): min=9, max=1662, avg=301.76, stdev=179.55 00:25:20.742 lat (usec): min=39, max=1792, avg=358.98, stdev=193.09 00:25:20.742 clat percentiles (usec): 00:25:20.742 | 50.000th=[ 269], 99.000th=[ 840], 99.900th=[ 963], 99.990th=[ 1020], 00:25:20.742 | 99.999th=[ 1418] 00:25:20.743 write: IOPS=27.0k, BW=105MiB/s (111MB/s)(1027MiB/9736msec); 0 zone resets 00:25:20.743 slat (usec): min=15, max=419, avg=65.69, stdev=26.84 00:25:20.743 clat (usec): min=13, max=2053, avg=352.98, stdev=206.69 00:25:20.743 lat (usec): min=33, max=2157, avg=418.67, stdev=220.47 00:25:20.743 clat percentiles (usec): 00:25:20.743 | 50.000th=[ 318], 99.000th=[ 1029], 99.900th=[ 1188], 99.990th=[ 1303], 00:25:20.743 | 99.999th=[ 1762] 00:25:20.743 bw ( KiB/s): min=89728, max=160961, per=97.50%, avg=105294.37, stdev=3891.64, samples=76 00:25:20.743 iops : min=22432, max=40240, avg=26323.58, stdev=972.89, samples=76 00:25:20.743 lat (usec) : 10=0.01%, 20=0.01%, 50=0.36%, 100=7.76%, 250=32.38% 00:25:20.743 lat (usec) : 500=43.10%, 750=12.34%, 1000=3.46% 00:25:20.743 lat (msec) : 2=0.60%, 4=0.01% 00:25:20.743 cpu : usr=99.67%, sys=0.01%, ctx=72, majf=0, minf=271 00:25:20.743 IO depths : 1=10.0%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:20.743 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:20.743 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:20.743 issued rwts: total=245218,262861,0,0 short=0,0,0,0 dropped=0,0,0,0 00:25:20.743 latency : target=0, window=0, percentile=100.00%, 
depth=8 00:25:20.743 00:25:20.743 Run status group 0 (all jobs): 00:25:20.743 READ: bw=95.8MiB/s (100MB/s), 95.8MiB/s-95.8MiB/s (100MB/s-100MB/s), io=958MiB (1004MB), run=10001-10001msec 00:25:20.743 WRITE: bw=105MiB/s (111MB/s), 105MiB/s-105MiB/s (111MB/s-111MB/s), io=1027MiB (1077MB), run=9736-9736msec 00:25:20.743 00:25:20.743 real 0m13.253s 00:25:20.743 user 0m48.336s 00:25:20.743 sys 0m0.373s 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:25:20.743 ************************************ 00:25:20.743 END TEST bdev_fio_rw_verify 00:25:20.743 ************************************ 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 
00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "99501f13-3c00-5348-b855-2574392cef43"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "99501f13-3c00-5348-b855-2574392cef43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a733981-b4b4-527c-9d1f-ee887708367d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6a733981-b4b4-527c-9d1f-ee887708367d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6cc1a231-600f-5cbd-9238-c891439c5e0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6cc1a231-600f-5cbd-9238-c891439c5e0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "3542f73c-3924-5ae1-aa23-b97c82b06021"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3542f73c-3924-5ae1-aa23-b97c82b06021",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:25:20.743 crypto_ram2 00:25:20.743 crypto_ram3 00:25:20.743 crypto_ram4 ]] 00:25:20.743 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' 
"name": "crypto_ram",' ' "aliases": [' ' "99501f13-3c00-5348-b855-2574392cef43"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "99501f13-3c00-5348-b855-2574392cef43",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "6a733981-b4b4-527c-9d1f-ee887708367d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6a733981-b4b4-527c-9d1f-ee887708367d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "6cc1a231-600f-5cbd-9238-c891439c5e0b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "6cc1a231-600f-5cbd-9238-c891439c5e0b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "3542f73c-3924-5ae1-aa23-b97c82b06021"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3542f73c-3924-5ae1-aa23-b97c82b06021",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 
00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:20.744 ************************************ 00:25:20.744 START TEST bdev_fio_trim 00:25:20.744 ************************************ 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 
00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:25:20.744 23:47:03 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:25:20.744 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:20.744 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:20.744 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:20.744 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:25:20.744 fio-3.35 00:25:20.744 Starting 4 threads 00:25:32.946 00:25:32.946 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=427424: Wed Jul 24 23:47:16 2024 00:25:32.946 write: IOPS=44.1k, BW=172MiB/s (181MB/s)(1723MiB/10001msec); 0 zone resets 00:25:32.946 slat (usec): min=11, max=1200, avg=51.70, stdev=26.16 00:25:32.946 clat (usec): min=35, max=1023, avg=228.58, stdev=137.64 00:25:32.946 lat (usec): min=53, max=1614, avg=280.29, stdev=153.15 00:25:32.946 clat 
percentiles (usec): 00:25:32.946 | 50.000th=[ 194], 99.000th=[ 660], 99.900th=[ 799], 99.990th=[ 906], 00:25:32.946 | 99.999th=[ 1012] 00:25:32.946 bw ( KiB/s): min=150416, max=259616, per=100.00%, avg=177520.84, stdev=8365.34, samples=76 00:25:32.946 iops : min=37604, max=64904, avg=44380.21, stdev=2091.34, samples=76 00:25:32.946 trim: IOPS=44.1k, BW=172MiB/s (181MB/s)(1723MiB/10001msec); 0 zone resets 00:25:32.946 slat (usec): min=4, max=259, avg=15.45, stdev= 6.87 00:25:32.946 clat (usec): min=45, max=1615, avg=215.56, stdev=103.17 00:25:32.946 lat (usec): min=53, max=1631, avg=231.01, stdev=105.77 00:25:32.946 clat percentiles (usec): 00:25:32.946 | 50.000th=[ 198], 99.000th=[ 498], 99.900th=[ 578], 99.990th=[ 660], 00:25:32.946 | 99.999th=[ 848] 00:25:32.946 bw ( KiB/s): min=150416, max=259640, per=100.00%, avg=177522.11, stdev=8365.71, samples=76 00:25:32.946 iops : min=37604, max=64910, avg=44380.53, stdev=2091.43, samples=76 00:25:32.946 lat (usec) : 50=1.36%, 100=11.58%, 250=54.03%, 500=29.94%, 750=2.96% 00:25:32.946 lat (usec) : 1000=0.12% 00:25:32.946 lat (msec) : 2=0.01% 00:25:32.946 cpu : usr=99.69%, sys=0.00%, ctx=90, majf=0, minf=119 00:25:32.946 IO depths : 1=7.4%, 2=26.4%, 4=52.9%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:25:32.946 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:32.946 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:25:32.946 issued rwts: total=0,440989,440990,0 short=0,0,0,0 dropped=0,0,0,0 00:25:32.946 latency : target=0, window=0, percentile=100.00%, depth=8 00:25:32.946 00:25:32.946 Run status group 0 (all jobs): 00:25:32.946 WRITE: bw=172MiB/s (181MB/s), 172MiB/s-172MiB/s (181MB/s-181MB/s), io=1723MiB (1806MB), run=10001-10001msec 00:25:32.946 TRIM: bw=172MiB/s (181MB/s), 172MiB/s-172MiB/s (181MB/s-181MB/s), io=1723MiB (1806MB), run=10001-10001msec 00:25:32.946 00:25:32.946 real 0m13.271s 00:25:32.946 user 0m48.283s 00:25:32.946 sys 0m0.366s 00:25:32.946 23:47:16 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:32.946 23:47:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:25:32.946 ************************************ 00:25:32.946 END TEST bdev_fio_trim 00:25:32.946 ************************************ 00:25:32.946 23:47:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:25:32.946 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:25:32.946 00:25:32.946 real 0m26.829s 00:25:32.946 user 1m36.800s 00:25:32.946 sys 0m0.881s 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:25:32.946 ************************************ 00:25:32.946 END TEST bdev_fio 00:25:32.946 ************************************ 00:25:32.946 23:47:17 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:32.946 23:47:17 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:25:32.946 23:47:17 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:25:32.946 23:47:17 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:32.946 23:47:17 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:32.946 
************************************ 00:25:32.946 START TEST bdev_verify 00:25:32.946 ************************************ 00:25:32.946 23:47:17 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:25:32.946 [2024-07-24 23:47:17.126159] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:32.946 [2024-07-24 23:47:17.126195] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid429252 ] 00:25:32.946 [2024-07-24 23:47:17.188836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:32.946 [2024-07-24 23:47:17.265496] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:32.946 [2024-07-24 23:47:17.265514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:32.946 [2024-07-24 23:47:17.286585] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:32.946 [2024-07-24 23:47:17.294608] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:32.946 [2024-07-24 23:47:17.302627] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:32.946 [2024-07-24 23:47:17.396717] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:34.846 [2024-07-24 23:47:19.545583] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:34.846 [2024-07-24 23:47:19.545646] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:34.846 
[2024-07-24 23:47:19.545654] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:34.846 [2024-07-24 23:47:19.553599] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:34.846 [2024-07-24 23:47:19.553612] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:34.846 [2024-07-24 23:47:19.553618] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:34.846 [2024-07-24 23:47:19.561620] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:34.846 [2024-07-24 23:47:19.561631] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:34.846 [2024-07-24 23:47:19.561637] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:34.846 [2024-07-24 23:47:19.569641] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:34.846 [2024-07-24 23:47:19.569651] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:34.846 [2024-07-24 23:47:19.569656] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:34.846 Running I/O for 5 seconds... 
00:25:40.116 00:25:40.116 Latency(us) 00:25:40.116 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:40.116 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x0 length 0x1000 00:25:40.116 crypto_ram : 5.04 710.60 2.78 0.00 0.00 179808.06 6241.52 119337.94 00:25:40.116 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x1000 length 0x1000 00:25:40.116 crypto_ram : 5.04 710.78 2.78 0.00 0.00 179671.73 12607.88 118838.61 00:25:40.116 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x0 length 0x1000 00:25:40.116 crypto_ram2 : 5.04 710.51 2.78 0.00 0.00 179456.31 6616.02 112347.43 00:25:40.116 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x1000 length 0x1000 00:25:40.116 crypto_ram2 : 5.04 710.69 2.78 0.00 0.00 179321.65 6647.22 112347.43 00:25:40.116 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x0 length 0x1000 00:25:40.116 crypto_ram3 : 5.04 5589.96 21.84 0.00 0.00 22749.55 3073.95 19723.22 00:25:40.116 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x1000 length 0x1000 00:25:40.116 crypto_ram3 : 5.04 5616.23 21.94 0.00 0.00 22643.06 2621.44 19723.22 00:25:40.116 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x0 length 0x1000 00:25:40.116 crypto_ram4 : 5.04 5587.59 21.83 0.00 0.00 22702.98 3760.52 17725.93 00:25:40.116 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:25:40.116 Verification LBA range: start 0x1000 length 0x1000 00:25:40.117 crypto_ram4 : 5.04 5615.40 21.94 0.00 0.00 22595.90 2668.25 
17226.61 00:25:40.117 =================================================================================================================== 00:25:40.117 Total : 25251.77 98.64 0.00 0.00 40350.69 2621.44 119337.94 00:25:40.117 00:25:40.117 real 0m7.935s 00:25:40.117 user 0m15.277s 00:25:40.117 sys 0m0.253s 00:25:40.117 23:47:25 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:40.117 23:47:25 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:25:40.117 ************************************ 00:25:40.117 END TEST bdev_verify 00:25:40.117 ************************************ 00:25:40.117 23:47:25 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:25:40.117 23:47:25 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:25:40.117 23:47:25 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:40.117 23:47:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:40.117 ************************************ 00:25:40.117 START TEST bdev_verify_big_io 00:25:40.117 ************************************ 00:25:40.117 23:47:25 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:25:40.375 [2024-07-24 23:47:25.130168] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:25:40.375 [2024-07-24 23:47:25.130208] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid430510 ] 00:25:40.375 [2024-07-24 23:47:25.193184] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:40.375 [2024-07-24 23:47:25.266714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:40.375 [2024-07-24 23:47:25.266716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:40.375 [2024-07-24 23:47:25.287703] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:40.375 [2024-07-24 23:47:25.295724] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:40.375 [2024-07-24 23:47:25.303741] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:40.634 [2024-07-24 23:47:25.399230] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:43.166 [2024-07-24 23:47:27.542578] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:43.166 [2024-07-24 23:47:27.542636] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:43.166 [2024-07-24 23:47:27.542644] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:43.166 [2024-07-24 23:47:27.550597] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:43.166 [2024-07-24 23:47:27.550609] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:43.166 [2024-07-24 23:47:27.550615] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:25:43.166 [2024-07-24 23:47:27.558618] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:43.166 [2024-07-24 23:47:27.558628] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:43.166 [2024-07-24 23:47:27.558634] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:43.166 [2024-07-24 23:47:27.566639] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:43.166 [2024-07-24 23:47:27.566648] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:43.166 [2024-07-24 23:47:27.566654] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:43.166 Running I/O for 5 seconds... 00:25:48.436 00:25:48.436 Latency(us) 00:25:48.436 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:48.436 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x0 length 0x100 00:25:48.436 crypto_ram : 5.54 68.58 4.29 0.00 0.00 1825301.77 52928.12 1621797.55 00:25:48.436 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x100 length 0x100 00:25:48.436 crypto_ram : 5.54 68.56 4.29 0.00 0.00 1826178.87 52928.12 1621797.55 00:25:48.436 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x0 length 0x100 00:25:48.436 crypto_ram2 : 5.54 68.58 4.29 0.00 0.00 1784166.19 52928.12 1621797.55 00:25:48.436 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x100 length 0x100 00:25:48.436 crypto_ram2 : 5.54 68.55 4.28 0.00 0.00 1784944.08 52678.46 1621797.55 00:25:48.436 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 
65536) 00:25:48.436 Verification LBA range: start 0x0 length 0x100 00:25:48.436 crypto_ram3 : 5.37 468.58 29.29 0.00 0.00 254745.66 19473.55 359511.77 00:25:48.436 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x100 length 0x100 00:25:48.436 crypto_ram3 : 5.36 466.75 29.17 0.00 0.00 255645.11 14293.09 359511.77 00:25:48.436 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x0 length 0x100 00:25:48.436 crypto_ram4 : 5.41 481.75 30.11 0.00 0.00 243740.79 1950.48 321563.31 00:25:48.436 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:25:48.436 Verification LBA range: start 0x100 length 0x100 00:25:48.436 crypto_ram4 : 5.41 480.12 30.01 0.00 0.00 244484.95 11858.90 323560.59 00:25:48.437 =================================================================================================================== 00:25:48.437 Total : 2171.48 135.72 0.00 0.00 450959.28 1950.48 1621797.55 00:25:48.695 00:25:48.695 real 0m8.445s 00:25:48.695 user 0m16.274s 00:25:48.695 sys 0m0.271s 00:25:48.695 23:47:33 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:48.695 23:47:33 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:25:48.695 ************************************ 00:25:48.695 END TEST bdev_verify_big_io 00:25:48.695 ************************************ 00:25:48.695 23:47:33 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:48.695 23:47:33 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:25:48.695 23:47:33 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:25:48.695 23:47:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:48.695 ************************************ 00:25:48.695 START TEST bdev_write_zeroes 00:25:48.695 ************************************ 00:25:48.695 23:47:33 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:48.695 [2024-07-24 23:47:33.644942] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:48.695 [2024-07-24 23:47:33.644984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid431895 ] 00:25:48.953 [2024-07-24 23:47:33.710650] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:48.953 [2024-07-24 23:47:33.784021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:48.953 [2024-07-24 23:47:33.804874] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:25:48.953 [2024-07-24 23:47:33.812900] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:25:48.953 [2024-07-24 23:47:33.820917] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:25:48.953 [2024-07-24 23:47:33.918757] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:25:51.555 [2024-07-24 23:47:36.063571] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:25:51.555 [2024-07-24 23:47:36.063623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 
00:25:51.555 [2024-07-24 23:47:36.063631] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:51.555 [2024-07-24 23:47:36.071589] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:25:51.555 [2024-07-24 23:47:36.071608] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:51.555 [2024-07-24 23:47:36.071613] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:51.555 [2024-07-24 23:47:36.079609] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:25:51.555 [2024-07-24 23:47:36.079617] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:25:51.555 [2024-07-24 23:47:36.079622] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:51.555 [2024-07-24 23:47:36.087628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:25:51.555 [2024-07-24 23:47:36.087636] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:25:51.555 [2024-07-24 23:47:36.087641] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:51.555 Running I/O for 1 seconds... 
00:25:52.491 00:25:52.491 Latency(us) 00:25:52.491 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.491 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:25:52.491 crypto_ram : 1.02 3031.23 11.84 0.00 0.00 42021.83 3432.84 50681.17 00:25:52.491 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:25:52.491 crypto_ram2 : 1.02 3037.09 11.86 0.00 0.00 41791.24 3479.65 47185.92 00:25:52.491 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:25:52.491 crypto_ram3 : 1.01 23628.38 92.30 0.00 0.00 5364.24 1568.18 6896.88 00:25:52.491 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:25:52.491 crypto_ram4 : 1.01 23613.52 92.24 0.00 0.00 5352.86 1575.98 5867.03 00:25:52.491 =================================================================================================================== 00:25:52.491 Total : 53310.22 208.24 0.00 0.00 9529.91 1568.18 50681.17 00:25:52.491 00:25:52.491 real 0m3.891s 00:25:52.491 user 0m3.589s 00:25:52.491 sys 0m0.264s 00:25:52.491 23:47:37 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:52.491 23:47:37 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:25:52.491 ************************************ 00:25:52.491 END TEST bdev_write_zeroes 00:25:52.491 ************************************ 00:25:52.749 23:47:37 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:52.749 23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:25:52.749 23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:52.750 
23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:52.750 ************************************ 00:25:52.750 START TEST bdev_json_nonenclosed 00:25:52.750 ************************************ 00:25:52.750 23:47:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:52.750 [2024-07-24 23:47:37.592426] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:52.750 [2024-07-24 23:47:37.592459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid432474 ] 00:25:52.750 [2024-07-24 23:47:37.655628] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.750 [2024-07-24 23:47:37.731386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:52.750 [2024-07-24 23:47:37.731439] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:25:52.750 [2024-07-24 23:47:37.731449] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:25:52.750 [2024-07-24 23:47:37.731456] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:25:53.008 00:25:53.008 real 0m0.259s 00:25:53.008 user 0m0.167s 00:25:53.008 sys 0m0.090s 00:25:53.008 23:47:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:53.008 23:47:37 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:25:53.008 ************************************ 00:25:53.008 END TEST bdev_json_nonenclosed 00:25:53.008 ************************************ 00:25:53.008 23:47:37 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:53.008 23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:25:53.008 23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:53.008 23:47:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:53.008 ************************************ 00:25:53.008 START TEST bdev_json_nonarray 00:25:53.008 ************************************ 00:25:53.008 23:47:37 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:25:53.008 [2024-07-24 23:47:37.917690] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:25:53.008 [2024-07-24 23:47:37.917724] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid432646 ] 00:25:53.008 [2024-07-24 23:47:37.980718] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:53.267 [2024-07-24 23:47:38.053442] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.267 [2024-07-24 23:47:38.053506] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:25:53.267 [2024-07-24 23:47:38.053515] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:25:53.267 [2024-07-24 23:47:38.053521] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:25:53.267 00:25:53.267 real 0m0.255s 00:25:53.267 user 0m0.167s 00:25:53.267 sys 0m0.086s 00:25:53.267 23:47:38 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:53.267 23:47:38 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:25:53.267 ************************************ 00:25:53.267 END TEST bdev_json_nonarray 00:25:53.267 ************************************ 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:25:53.267 
23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:25:53.267 23:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:25:53.267 00:25:53.267 real 1m6.278s 00:25:53.267 user 2m39.030s 00:25:53.267 sys 0m5.736s 00:25:53.267 23:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:53.267 23:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:25:53.267 ************************************ 00:25:53.267 END TEST blockdev_crypto_aesni 00:25:53.267 ************************************ 00:25:53.267 23:47:38 -- spdk/autotest.sh@362 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:25:53.267 23:47:38 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:25:53.267 23:47:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:53.267 23:47:38 -- common/autotest_common.sh@10 -- # set +x 00:25:53.267 ************************************ 00:25:53.267 START TEST blockdev_crypto_sw 00:25:53.267 ************************************ 00:25:53.267 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:25:53.526 * Looking for test storage... 
00:25:53.526 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:25:53.526 
23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=432774 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:25:53.526 23:47:38 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 432774 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # '[' -z 432774 ']' 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:53.526 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:53.526 23:47:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:53.526 [2024-07-24 23:47:38.374055] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:25:53.526 [2024-07-24 23:47:38.374097] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid432774 ] 00:25:53.526 [2024-07-24 23:47:38.438855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:53.527 [2024-07-24 23:47:38.509643] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@864 -- # return 0 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.462 Malloc0 00:25:54.462 Malloc1 00:25:54.462 true 00:25:54.462 true 00:25:54.462 true 00:25:54.462 [2024-07-24 23:47:39.397427] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:25:54.462 crypto_ram 00:25:54.462 [2024-07-24 23:47:39.405455] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:25:54.462 crypto_ram2 00:25:54.462 [2024-07-24 23:47:39.413480] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:25:54.462 crypto_ram3 00:25:54.462 [ 00:25:54.462 { 00:25:54.462 "name": "Malloc1", 00:25:54.462 "aliases": [ 00:25:54.462 "d3d6f8b8-6c3b-4a94-a916-606d4f6d81e2" 00:25:54.462 ], 00:25:54.462 "product_name": "Malloc disk", 00:25:54.462 "block_size": 4096, 00:25:54.462 "num_blocks": 4096, 00:25:54.462 "uuid": "d3d6f8b8-6c3b-4a94-a916-606d4f6d81e2", 
00:25:54.462 "assigned_rate_limits": { 00:25:54.462 "rw_ios_per_sec": 0, 00:25:54.462 "rw_mbytes_per_sec": 0, 00:25:54.462 "r_mbytes_per_sec": 0, 00:25:54.462 "w_mbytes_per_sec": 0 00:25:54.462 }, 00:25:54.462 "claimed": true, 00:25:54.462 "claim_type": "exclusive_write", 00:25:54.462 "zoned": false, 00:25:54.462 "supported_io_types": { 00:25:54.462 "read": true, 00:25:54.462 "write": true, 00:25:54.462 "unmap": true, 00:25:54.462 "flush": true, 00:25:54.462 "reset": true, 00:25:54.462 "nvme_admin": false, 00:25:54.462 "nvme_io": false, 00:25:54.462 "nvme_io_md": false, 00:25:54.462 "write_zeroes": true, 00:25:54.462 "zcopy": true, 00:25:54.462 "get_zone_info": false, 00:25:54.462 "zone_management": false, 00:25:54.462 "zone_append": false, 00:25:54.462 "compare": false, 00:25:54.462 "compare_and_write": false, 00:25:54.462 "abort": true, 00:25:54.462 "seek_hole": false, 00:25:54.462 "seek_data": false, 00:25:54.462 "copy": true, 00:25:54.462 "nvme_iov_md": false 00:25:54.462 }, 00:25:54.462 "memory_domains": [ 00:25:54.462 { 00:25:54.462 "dma_device_id": "system", 00:25:54.462 "dma_device_type": 1 00:25:54.462 }, 00:25:54.462 { 00:25:54.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:54.462 "dma_device_type": 2 00:25:54.462 } 00:25:54.462 ], 00:25:54.462 "driver_specific": {} 00:25:54.462 } 00:25:54.462 ] 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:25:54.462 23:47:39 
blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.462 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.462 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "79f365f2-7fff-5abe-877e-286d68cb3bea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "79f365f2-7fff-5abe-877e-286d68cb3bea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "505cf14d-b937-52f0-8fa7-c616779ddc10"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "505cf14d-b937-52f0-8fa7-c616779ddc10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:25:54.721 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 432774 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # '[' -z 432774 ']' 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # kill -0 432774 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # uname 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 432774 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # echo 'killing process with pid 432774' 00:25:54.721 killing process with pid 432774 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@969 -- # kill 432774 00:25:54.721 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@974 -- # wait 432774 00:25:54.980 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:25:54.980 23:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:54.980 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:54.980 23:47:39 
blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:54.980 23:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:54.980 ************************************ 00:25:54.980 START TEST bdev_hello_world 00:25:54.980 ************************************ 00:25:54.980 23:47:39 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:25:55.239 [2024-07-24 23:47:40.017411] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:55.239 [2024-07-24 23:47:40.017448] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid433029 ] 00:25:55.239 [2024-07-24 23:47:40.081686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:55.239 [2024-07-24 23:47:40.154321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:55.498 [2024-07-24 23:47:40.316080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:25:55.498 [2024-07-24 23:47:40.316134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:55.498 [2024-07-24 23:47:40.316142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:55.498 [2024-07-24 23:47:40.324097] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:25:55.498 [2024-07-24 23:47:40.324108] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:55.498 [2024-07-24 23:47:40.324113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:55.498 
[2024-07-24 23:47:40.332118] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:25:55.498 [2024-07-24 23:47:40.332127] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:25:55.498 [2024-07-24 23:47:40.332132] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:55.498 [2024-07-24 23:47:40.370239] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:25:55.498 [2024-07-24 23:47:40.370263] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:25:55.498 [2024-07-24 23:47:40.370272] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:25:55.498 [2024-07-24 23:47:40.371185] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:25:55.498 [2024-07-24 23:47:40.371232] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:25:55.498 [2024-07-24 23:47:40.371241] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:25:55.498 [2024-07-24 23:47:40.371266] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:25:55.498 00:25:55.498 [2024-07-24 23:47:40.371276] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:25:55.757 00:25:55.757 real 0m0.579s 00:25:55.757 user 0m0.401s 00:25:55.757 sys 0m0.160s 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:25:55.757 ************************************ 00:25:55.757 END TEST bdev_hello_world 00:25:55.757 ************************************ 00:25:55.757 23:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:25:55.757 23:47:40 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:25:55.757 23:47:40 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:55.757 23:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:55.757 ************************************ 00:25:55.757 START TEST bdev_bounds 00:25:55.757 ************************************ 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=433142 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 433142' 00:25:55.757 Process bdevio pid: 433142 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 433142 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@831 -- # '[' -z 433142 ']' 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:55.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:55.757 23:47:40 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:55.757 [2024-07-24 23:47:40.645822] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:55.757 [2024-07-24 23:47:40.645860] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid433142 ] 00:25:55.757 [2024-07-24 23:47:40.709387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:56.016 [2024-07-24 23:47:40.791401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:25:56.016 [2024-07-24 23:47:40.791505] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:25:56.016 [2024-07-24 23:47:40.791507] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:56.016 [2024-07-24 23:47:40.944384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:25:56.016 [2024-07-24 23:47:40.944430] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:56.016 [2024-07-24 23:47:40.944438] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending 
base bdev arrival 00:25:56.016 [2024-07-24 23:47:40.952406] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:25:56.016 [2024-07-24 23:47:40.952417] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:56.016 [2024-07-24 23:47:40.952422] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:56.016 [2024-07-24 23:47:40.960438] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:25:56.016 [2024-07-24 23:47:40.960452] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:25:56.016 [2024-07-24 23:47:40.960457] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:56.583 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:56.583 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:25:56.583 23:47:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:56.583 I/O targets: 00:25:56.583 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:25:56.583 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:25:56.583 00:25:56.583 00:25:56.583 CUnit - A unit testing framework for C - Version 2.1-3 00:25:56.583 http://cunit.sourceforge.net/ 00:25:56.583 00:25:56.583 00:25:56.583 Suite: bdevio tests on: crypto_ram3 00:25:56.583 Test: blockdev write read block ...passed 00:25:56.583 Test: blockdev write zeroes read block ...passed 00:25:56.583 Test: blockdev write zeroes read no split ...passed 00:25:56.583 Test: blockdev write zeroes read split ...passed 00:25:56.583 Test: blockdev write zeroes read split partial ...passed 00:25:56.583 Test: blockdev reset ...passed 00:25:56.583 Test: blockdev write read 8 blocks ...passed 00:25:56.583 Test: blockdev write read 
size > 128k ...passed 00:25:56.583 Test: blockdev write read invalid size ...passed 00:25:56.583 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:56.583 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:56.583 Test: blockdev write read max offset ...passed 00:25:56.583 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:56.583 Test: blockdev writev readv 8 blocks ...passed 00:25:56.583 Test: blockdev writev readv 30 x 1block ...passed 00:25:56.583 Test: blockdev writev readv block ...passed 00:25:56.583 Test: blockdev writev readv size > 128k ...passed 00:25:56.583 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:56.583 Test: blockdev comparev and writev ...passed 00:25:56.583 Test: blockdev nvme passthru rw ...passed 00:25:56.583 Test: blockdev nvme passthru vendor specific ...passed 00:25:56.583 Test: blockdev nvme admin passthru ...passed 00:25:56.583 Test: blockdev copy ...passed 00:25:56.583 Suite: bdevio tests on: crypto_ram 00:25:56.583 Test: blockdev write read block ...passed 00:25:56.583 Test: blockdev write zeroes read block ...passed 00:25:56.583 Test: blockdev write zeroes read no split ...passed 00:25:56.583 Test: blockdev write zeroes read split ...passed 00:25:56.583 Test: blockdev write zeroes read split partial ...passed 00:25:56.583 Test: blockdev reset ...passed 00:25:56.583 Test: blockdev write read 8 blocks ...passed 00:25:56.583 Test: blockdev write read size > 128k ...passed 00:25:56.583 Test: blockdev write read invalid size ...passed 00:25:56.583 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:56.583 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:56.583 Test: blockdev write read max offset ...passed 00:25:56.583 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:56.583 Test: blockdev writev readv 8 blocks ...passed 00:25:56.583 Test: blockdev writev 
readv 30 x 1block ...passed 00:25:56.583 Test: blockdev writev readv block ...passed 00:25:56.583 Test: blockdev writev readv size > 128k ...passed 00:25:56.583 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:56.583 Test: blockdev comparev and writev ...passed 00:25:56.583 Test: blockdev nvme passthru rw ...passed 00:25:56.583 Test: blockdev nvme passthru vendor specific ...passed 00:25:56.583 Test: blockdev nvme admin passthru ...passed 00:25:56.583 Test: blockdev copy ...passed 00:25:56.583 00:25:56.583 Run Summary: Type Total Ran Passed Failed Inactive 00:25:56.583 suites 2 2 n/a 0 0 00:25:56.583 tests 46 46 46 0 0 00:25:56.583 asserts 260 260 260 0 n/a 00:25:56.583 00:25:56.583 Elapsed time = 0.080 seconds 00:25:56.583 0 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 433142 ']' 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 433142' 00:25:56.842 killing process with pid 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@969 -- # kill 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@974 
-- # wait 433142 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:25:56.842 00:25:56.842 real 0m1.206s 00:25:56.842 user 0m3.243s 00:25:56.842 sys 0m0.267s 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:56.842 23:47:41 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:25:56.842 ************************************ 00:25:56.842 END TEST bdev_bounds 00:25:56.842 ************************************ 00:25:56.842 23:47:41 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:25:56.842 23:47:41 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:25:56.842 23:47:41 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:56.842 23:47:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:25:57.101 ************************************ 00:25:57.101 START TEST bdev_nbd 00:25:57.101 ************************************ 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:25:57.101 23:47:41 
blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=433329 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 433329 /var/tmp/spdk-nbd.sock 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 433329 ']' 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-nbd.sock 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:25:57.101 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:57.101 23:47:41 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:25:57.101 [2024-07-24 23:47:41.928978] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:25:57.101 [2024-07-24 23:47:41.929015] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:57.101 [2024-07-24 23:47:41.991932] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:57.101 [2024-07-24 23:47:42.070376] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:57.360 [2024-07-24 23:47:42.223586] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:25:57.360 [2024-07-24 23:47:42.223634] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:25:57.360 [2024-07-24 23:47:42.223642] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:57.360 [2024-07-24 23:47:42.231604] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:25:57.360 [2024-07-24 23:47:42.231615] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:25:57.360 [2024-07-24 23:47:42.231620] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation 
deferred pending base bdev arrival 00:25:57.360 [2024-07-24 23:47:42.239623] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:25:57.360 [2024-07-24 23:47:42.239633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:25:57.360 [2024-07-24 23:47:42.239638] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 
00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:57.926 1+0 records in 00:25:57.926 1+0 records out 00:25:57.926 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244138 s, 16.8 MB/s 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # 
rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:57.926 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:25:58.184 23:47:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:58.184 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.185 1+0 
records in 00:25:58.185 1+0 records out 00:25:58.185 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241711 s, 16.9 MB/s 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:25:58.185 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:25:58.443 { 00:25:58.443 "nbd_device": "/dev/nbd0", 00:25:58.443 "bdev_name": "crypto_ram" 00:25:58.443 }, 00:25:58.443 { 00:25:58.443 "nbd_device": "/dev/nbd1", 00:25:58.443 "bdev_name": "crypto_ram3" 00:25:58.443 } 00:25:58.443 ]' 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:25:58.443 { 00:25:58.443 "nbd_device": "/dev/nbd0", 00:25:58.443 "bdev_name": "crypto_ram" 00:25:58.443 }, 00:25:58.443 { 00:25:58.443 "nbd_device": "/dev/nbd1", 00:25:58.443 "bdev_name": "crypto_ram3" 00:25:58.443 } 00:25:58.443 ]' 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 
-- # jq -r '.[] | .nbd_device' 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:58.443 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:58.444 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:58.702 23:47:43 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:58.702 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 
-- # echo 0 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:58.961 23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:58.961 
23:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:25:59.219 /dev/nbd0 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.219 1+0 records in 00:25:59.219 1+0 records out 00:25:59.219 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025414 s, 16.1 MB/s 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:59.219 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:25:59.478 /dev/nbd1 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:59.478 1+0 records in 00:25:59.478 1+0 records out 00:25:59.478 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205541 s, 19.9 MB/s 00:25:59.478 23:47:44 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:59.478 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:25:59.736 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:25:59.736 { 00:25:59.736 "nbd_device": "/dev/nbd0", 00:25:59.736 "bdev_name": "crypto_ram" 00:25:59.736 }, 00:25:59.736 { 00:25:59.736 "nbd_device": "/dev/nbd1", 00:25:59.736 "bdev_name": "crypto_ram3" 00:25:59.736 } 00:25:59.736 ]' 00:25:59.736 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:25:59.737 { 00:25:59.737 "nbd_device": "/dev/nbd0", 00:25:59.737 "bdev_name": "crypto_ram" 00:25:59.737 }, 00:25:59.737 { 00:25:59.737 "nbd_device": "/dev/nbd1", 00:25:59.737 "bdev_name": "crypto_ram3" 00:25:59.737 } 00:25:59.737 ]' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:25:59.737 23:47:44 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:25:59.737 /dev/nbd1' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:25:59.737 /dev/nbd1' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:25:59.737 256+0 records in 00:25:59.737 256+0 records out 00:25:59.737 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0100684 s, 104 MB/s 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 
count=256 oflag=direct 00:25:59.737 256+0 records in 00:25:59.737 256+0 records out 00:25:59.737 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0143356 s, 73.1 MB/s 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:25:59.737 256+0 records in 00:25:59.737 256+0 records out 00:25:59.737 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213477 s, 49.1 MB/s 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:25:59.737 23:47:44 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.737 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:25:59.995 23:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:00.253 23:47:45 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:00.253 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:00.510 malloc_lvol_verify 00:26:00.511 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:00.511 e72cbab3-95b1-4715-b592-71cccc1797b0 00:26:00.769 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:00.769 3d3ff47c-7787-411c-be7d-ec71807a7a77 00:26:00.769 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:01.027 /dev/nbd0 
00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:01.027 mke2fs 1.46.5 (30-Dec-2021) 00:26:01.027 Discarding device blocks: 0/4096 done 00:26:01.027 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:01.027 00:26:01.027 Allocating group tables: 0/1 done 00:26:01.027 Writing inode tables: 0/1 done 00:26:01.027 Creating journal (1024 blocks): done 00:26:01.027 Writing superblocks and filesystem accounting information: 0/1 done 00:26:01.027 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:01.027 23:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:01.286 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- 
# grep -q -w nbd0 /proc/partitions 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 433329 ']' 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 433329' 00:26:01.287 killing process with pid 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@969 -- # kill 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@974 -- # wait 433329 00:26:01.287 23:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:26:01.287 00:26:01.287 real 0m4.415s 00:26:01.287 user 0m6.482s 00:26:01.287 sys 0m1.414s 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@10 -- # set +x 00:26:01.546 ************************************ 00:26:01.546 END TEST bdev_nbd 00:26:01.546 ************************************ 00:26:01.546 23:47:46 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:26:01.546 23:47:46 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:26:01.546 23:47:46 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:26:01.546 23:47:46 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:26:01.546 23:47:46 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:26:01.546 23:47:46 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:01.546 23:47:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:01.546 ************************************ 00:26:01.546 START TEST bdev_fio 00:26:01.546 ************************************ 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:01.546 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:26:01.546 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 
00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:01.547 ************************************ 00:26:01.547 START TEST bdev_fio_rw_verify 00:26:01.547 ************************************ 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # grep libasan 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:01.547 23:47:46 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:02.112 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:26:02.112 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:02.112 fio-3.35 00:26:02.112 Starting 2 threads 00:26:14.328 00:26:14.328 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=434473: Wed Jul 24 23:47:57 2024 00:26:14.328 read: IOPS=31.4k, BW=123MiB/s (129MB/s)(1227MiB/10001msec) 00:26:14.328 slat (usec): min=9, max=1240, avg=14.24, stdev= 3.71 00:26:14.328 clat (usec): min=5, max=1394, avg=102.34, stdev=41.42 00:26:14.328 lat (usec): min=18, max=1409, avg=116.59, stdev=42.62 00:26:14.328 clat percentiles (usec): 00:26:14.328 | 50.000th=[ 100], 99.000th=[ 198], 99.900th=[ 217], 99.990th=[ 253], 00:26:14.328 | 99.999th=[ 392] 00:26:14.328 write: IOPS=37.7k, BW=147MiB/s (154MB/s)(1397MiB/9484msec); 0 zone resets 00:26:14.328 slat (usec): min=9, max=349, avg=23.48, stdev= 3.38 00:26:14.328 clat (usec): min=16, max=595, avg=136.42, stdev=62.37 00:26:14.328 lat (usec): min=34, max=622, avg=159.90, stdev=63.50 00:26:14.328 clat percentiles (usec): 00:26:14.328 | 50.000th=[ 133], 99.000th=[ 273], 99.900th=[ 289], 99.990th=[ 412], 00:26:14.328 | 99.999th=[ 506] 00:26:14.328 bw ( KiB/s): min=135632, max=149384, per=94.77%, avg=142994.11, stdev=2110.56, samples=38 00:26:14.329 iops : min=33908, max=37346, avg=35748.53, stdev=527.64, samples=38 00:26:14.329 lat (usec) : 10=0.01%, 20=0.01%, 50=9.05%, 100=31.92%, 250=56.62% 00:26:14.329 lat (usec) : 500=2.40%, 750=0.01% 00:26:14.329 lat (msec) : 2=0.01% 00:26:14.329 cpu : usr=99.70%, sys=0.00%, ctx=64, majf=0, minf=477 00:26:14.329 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:14.329 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:14.329 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:14.329 issued rwts: total=314131,357731,0,0 short=0,0,0,0 dropped=0,0,0,0 00:26:14.329 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:14.329 
00:26:14.329 Run status group 0 (all jobs): 00:26:14.329 READ: bw=123MiB/s (129MB/s), 123MiB/s-123MiB/s (129MB/s-129MB/s), io=1227MiB (1287MB), run=10001-10001msec 00:26:14.329 WRITE: bw=147MiB/s (154MB/s), 147MiB/s-147MiB/s (154MB/s-154MB/s), io=1397MiB (1465MB), run=9484-9484msec 00:26:14.329 00:26:14.329 real 0m10.956s 00:26:14.329 user 0m26.556s 00:26:14.329 sys 0m0.251s 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:26:14.329 ************************************ 00:26:14.329 END TEST bdev_fio_rw_verify 00:26:14.329 ************************************ 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "79f365f2-7fff-5abe-877e-286d68cb3bea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "79f365f2-7fff-5abe-877e-286d68cb3bea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' 
"base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "505cf14d-b937-52f0-8fa7-c616779ddc10"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "505cf14d-b937-52f0-8fa7-c616779ddc10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:26:14.329 crypto_ram3 ]] 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "79f365f2-7fff-5abe-877e-286d68cb3bea"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "79f365f2-7fff-5abe-877e-286d68cb3bea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' 
' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "505cf14d-b937-52f0-8fa7-c616779ddc10"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "505cf14d-b937-52f0-8fa7-c616779ddc10",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 
00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:14.329 ************************************ 00:26:14.329 START TEST bdev_fio_trim 00:26:14.329 ************************************ 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- 
# fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:14.329 
23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:26:14.329 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:14.330 23:47:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:14.330 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:14.330 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:14.330 fio-3.35 00:26:14.330 Starting 2 threads 00:26:24.372 00:26:24.372 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=436414: Wed Jul 24 23:48:08 2024 00:26:24.372 write: IOPS=54.5k, BW=213MiB/s (223MB/s)(2130MiB/10001msec); 0 zone resets 00:26:24.372 slat (usec): min=9, max=1253, avg=15.90, stdev= 3.72 00:26:24.372 
clat (usec): min=24, max=1448, avg=120.78, stdev=67.35 00:26:24.372 lat (usec): min=34, max=1509, avg=136.68, stdev=69.80 00:26:24.372 clat percentiles (usec): 00:26:24.372 | 50.000th=[ 96], 99.000th=[ 253], 99.900th=[ 277], 99.990th=[ 570], 00:26:24.372 | 99.999th=[ 725] 00:26:24.372 bw ( KiB/s): min=211384, max=220768, per=100.00%, avg=218220.63, stdev=1279.49, samples=38 00:26:24.372 iops : min=52846, max=55192, avg=54555.16, stdev=319.87, samples=38 00:26:24.372 trim: IOPS=54.5k, BW=213MiB/s (223MB/s)(2130MiB/10001msec); 0 zone resets 00:26:24.372 slat (usec): min=3, max=348, avg= 7.37, stdev= 2.04 00:26:24.372 clat (usec): min=32, max=457, avg=80.51, stdev=24.77 00:26:24.372 lat (usec): min=37, max=466, avg=87.89, stdev=25.01 00:26:24.372 clat percentiles (usec): 00:26:24.372 | 50.000th=[ 81], 99.000th=[ 135], 99.900th=[ 151], 99.990th=[ 310], 00:26:24.372 | 99.999th=[ 375] 00:26:24.372 bw ( KiB/s): min=211416, max=220768, per=100.00%, avg=218222.32, stdev=1277.41, samples=38 00:26:24.372 iops : min=52854, max=55192, avg=54555.58, stdev=319.35, samples=38 00:26:24.372 lat (usec) : 50=14.31%, 100=49.19%, 250=35.81%, 500=0.68%, 750=0.01% 00:26:24.372 lat (usec) : 1000=0.01% 00:26:24.372 lat (msec) : 2=0.01% 00:26:24.372 cpu : usr=99.69%, sys=0.01%, ctx=18, majf=0, minf=264 00:26:24.372 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:26:24.372 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:24.372 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:26:24.372 issued rwts: total=0,545333,545333,0 short=0,0,0,0 dropped=0,0,0,0 00:26:24.372 latency : target=0, window=0, percentile=100.00%, depth=8 00:26:24.372 00:26:24.372 Run status group 0 (all jobs): 00:26:24.372 WRITE: bw=213MiB/s (223MB/s), 213MiB/s-213MiB/s (223MB/s-223MB/s), io=2130MiB (2234MB), run=10001-10001msec 00:26:24.372 TRIM: bw=213MiB/s (223MB/s), 213MiB/s-213MiB/s (223MB/s-223MB/s), io=2130MiB (2234MB), 
run=10001-10001msec 00:26:24.372 00:26:24.372 real 0m10.947s 00:26:24.372 user 0m26.592s 00:26:24.372 sys 0m0.254s 00:26:24.372 23:48:08 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:24.372 23:48:08 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:26:24.372 ************************************ 00:26:24.372 END TEST bdev_fio_trim 00:26:24.372 ************************************ 00:26:24.372 23:48:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:26:24.372 23:48:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:24.372 23:48:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:26:24.373 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:24.373 23:48:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:26:24.373 00:26:24.373 real 0m22.203s 00:26:24.373 user 0m53.316s 00:26:24.373 sys 0m0.652s 00:26:24.373 23:48:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:24.373 23:48:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:24.373 ************************************ 00:26:24.373 END TEST bdev_fio 00:26:24.373 ************************************ 00:26:24.373 23:48:08 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:24.373 23:48:08 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:24.373 23:48:08 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:26:24.373 23:48:08 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:24.373 23:48:08 
blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:24.373 ************************************ 00:26:24.373 START TEST bdev_verify 00:26:24.373 ************************************ 00:26:24.373 23:48:08 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:26:24.373 [2024-07-24 23:48:08.663257] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:26:24.373 [2024-07-24 23:48:08.663292] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid438318 ] 00:26:24.373 [2024-07-24 23:48:08.723692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:24.373 [2024-07-24 23:48:08.795455] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:24.373 [2024-07-24 23:48:08.795457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:24.373 [2024-07-24 23:48:08.953009] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:24.373 [2024-07-24 23:48:08.953055] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:24.373 [2024-07-24 23:48:08.953063] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:24.373 [2024-07-24 23:48:08.961031] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:24.373 [2024-07-24 23:48:08.961042] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:24.373 [2024-07-24 23:48:08.961048] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev 
arrival 00:26:24.373 [2024-07-24 23:48:08.969052] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:24.373 [2024-07-24 23:48:08.969064] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:24.373 [2024-07-24 23:48:08.969070] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:24.373 Running I/O for 5 seconds... 00:26:29.642 00:26:29.642 Latency(us) 00:26:29.642 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:29.642 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:29.642 Verification LBA range: start 0x0 length 0x800 00:26:29.642 crypto_ram : 5.02 7877.96 30.77 0.00 0.00 16188.81 1380.94 19848.05 00:26:29.642 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:29.642 Verification LBA range: start 0x800 length 0x800 00:26:29.642 crypto_ram : 5.02 7878.49 30.78 0.00 0.00 16187.35 1482.36 19848.05 00:26:29.642 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:26:29.642 Verification LBA range: start 0x0 length 0x800 00:26:29.642 crypto_ram3 : 5.03 3946.58 15.42 0.00 0.00 32298.01 1700.82 23343.30 00:26:29.642 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:26:29.642 Verification LBA range: start 0x800 length 0x800 00:26:29.642 crypto_ram3 : 5.03 3946.86 15.42 0.00 0.00 32292.22 1763.23 23343.30 00:26:29.642 =================================================================================================================== 00:26:29.642 Total : 23649.88 92.38 0.00 0.00 21568.66 1380.94 23343.30 00:26:29.642 00:26:29.642 real 0m5.618s 00:26:29.642 user 0m10.767s 00:26:29.642 sys 0m0.158s 00:26:29.642 23:48:14 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:29.642 23:48:14 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set 
+x 00:26:29.642 ************************************ 00:26:29.642 END TEST bdev_verify 00:26:29.642 ************************************ 00:26:29.642 23:48:14 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:29.642 23:48:14 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:26:29.642 23:48:14 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:29.642 23:48:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:29.642 ************************************ 00:26:29.642 START TEST bdev_verify_big_io 00:26:29.642 ************************************ 00:26:29.642 23:48:14 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:26:29.642 [2024-07-24 23:48:14.332911] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:26:29.642 [2024-07-24 23:48:14.332947] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid439461 ] 00:26:29.642 [2024-07-24 23:48:14.395210] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:29.642 [2024-07-24 23:48:14.467007] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:29.642 [2024-07-24 23:48:14.467009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:29.642 [2024-07-24 23:48:14.621508] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:29.642 [2024-07-24 23:48:14.621551] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:29.642 [2024-07-24 23:48:14.621560] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:29.643 [2024-07-24 23:48:14.629526] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:29.643 [2024-07-24 23:48:14.629537] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:29.643 [2024-07-24 23:48:14.629543] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:29.643 [2024-07-24 23:48:14.637549] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:29.643 [2024-07-24 23:48:14.637563] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:29.643 [2024-07-24 23:48:14.637568] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:29.901 Running I/O for 5 seconds... 
00:26:35.168 00:26:35.168 Latency(us) 00:26:35.168 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:35.168 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:35.168 Verification LBA range: start 0x0 length 0x80 00:26:35.168 crypto_ram : 5.07 808.22 50.51 0.00 0.00 155864.16 4119.41 214708.42 00:26:35.168 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:35.168 Verification LBA range: start 0x80 length 0x80 00:26:35.168 crypto_ram : 5.14 821.29 51.33 0.00 0.00 153513.39 4525.10 210713.84 00:26:35.168 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:26:35.168 Verification LBA range: start 0x0 length 0x80 00:26:35.168 crypto_ram3 : 5.16 421.82 26.36 0.00 0.00 292286.06 4056.99 220700.28 00:26:35.168 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:26:35.168 Verification LBA range: start 0x80 length 0x80 00:26:35.168 crypto_ram3 : 5.15 422.25 26.39 0.00 0.00 292288.06 4618.73 215707.06 00:26:35.168 =================================================================================================================== 00:26:35.168 Total : 2473.59 154.60 0.00 0.00 201932.88 4056.99 220700.28 00:26:35.168 00:26:35.168 real 0m5.756s 00:26:35.168 user 0m11.043s 00:26:35.168 sys 0m0.162s 00:26:35.168 23:48:20 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:35.168 23:48:20 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:26:35.168 ************************************ 00:26:35.168 END TEST bdev_verify_big_io 00:26:35.168 ************************************ 00:26:35.168 23:48:20 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:26:35.168 23:48:20 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:26:35.168 23:48:20 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:35.168 23:48:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:35.168 ************************************ 00:26:35.168 START TEST bdev_write_zeroes 00:26:35.169 ************************************ 00:26:35.169 23:48:20 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:35.169 [2024-07-24 23:48:20.166904] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:26:35.169 [2024-07-24 23:48:20.166943] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid440382 ] 00:26:35.427 [2024-07-24 23:48:20.229106] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:35.427 [2024-07-24 23:48:20.300554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:35.685 [2024-07-24 23:48:20.457493] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:35.685 [2024-07-24 23:48:20.457538] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:35.685 [2024-07-24 23:48:20.457547] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.686 [2024-07-24 23:48:20.465506] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:26:35.686 [2024-07-24 23:48:20.465517] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:35.686 
[2024-07-24 23:48:20.465526] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.686 [2024-07-24 23:48:20.473526] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:26:35.686 [2024-07-24 23:48:20.473537] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:26:35.686 [2024-07-24 23:48:20.473542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:35.686 Running I/O for 1 seconds... 00:26:36.621 00:26:36.621 Latency(us) 00:26:36.621 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:36.621 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:36.621 crypto_ram : 1.01 41486.71 162.06 0.00 0.00 3078.33 815.30 4493.90 00:26:36.622 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:26:36.622 crypto_ram3 : 1.01 20717.20 80.93 0.00 0.00 6148.06 3822.93 6709.64 00:26:36.622 =================================================================================================================== 00:26:36.622 Total : 62203.91 242.98 0.00 0.00 4101.57 815.30 6709.64 00:26:36.880 00:26:36.880 real 0m1.584s 00:26:36.880 user 0m1.419s 00:26:36.880 sys 0m0.146s 00:26:36.880 23:48:21 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:36.880 23:48:21 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:26:36.880 ************************************ 00:26:36.880 END TEST bdev_write_zeroes 00:26:36.880 ************************************ 00:26:36.880 23:48:21 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:26:36.880 23:48:21 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:26:36.880 23:48:21 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:36.880 23:48:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:36.880 ************************************ 00:26:36.880 START TEST bdev_json_nonenclosed 00:26:36.880 ************************************ 00:26:36.880 23:48:21 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:36.880 [2024-07-24 23:48:21.814939] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:26:36.880 [2024-07-24 23:48:21.814973] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid440708 ] 00:26:36.880 [2024-07-24 23:48:21.876715] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:37.139 [2024-07-24 23:48:21.949352] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:37.139 [2024-07-24 23:48:21.949413] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:26:37.139 [2024-07-24 23:48:21.949423] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:37.139 [2024-07-24 23:48:21.949429] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:37.139 00:26:37.139 real 0m0.253s 00:26:37.139 user 0m0.167s 00:26:37.139 sys 0m0.084s 00:26:37.139 23:48:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:37.139 23:48:22 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:26:37.139 ************************************ 00:26:37.139 END TEST bdev_json_nonenclosed 00:26:37.139 ************************************ 00:26:37.139 23:48:22 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:37.139 23:48:22 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:26:37.139 23:48:22 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:37.139 23:48:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:37.139 ************************************ 00:26:37.139 START TEST bdev_json_nonarray 00:26:37.139 ************************************ 00:26:37.139 23:48:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:26:37.139 [2024-07-24 23:48:22.137171] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:26:37.139 [2024-07-24 23:48:22.137207] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid440864 ] 00:26:37.398 [2024-07-24 23:48:22.198449] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:37.398 [2024-07-24 23:48:22.270089] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:37.398 [2024-07-24 23:48:22.270152] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:26:37.398 [2024-07-24 23:48:22.270161] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:26:37.398 [2024-07-24 23:48:22.270167] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:26:37.398 00:26:37.398 real 0m0.254s 00:26:37.398 user 0m0.166s 00:26:37.398 sys 0m0.087s 00:26:37.398 23:48:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:37.398 23:48:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:26:37.398 ************************************ 00:26:37.398 END TEST bdev_json_nonarray 00:26:37.398 ************************************ 00:26:37.398 23:48:22 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:26:37.398 23:48:22 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:26:37.398 23:48:22 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:26:37.398 23:48:22 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:26:37.398 23:48:22 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:26:37.398 23:48:22 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:37.398 23:48:22 blockdev_crypto_sw -- 
common/autotest_common.sh@10 -- # set +x 00:26:37.657 ************************************ 00:26:37.657 START TEST bdev_crypto_enomem 00:26:37.657 ************************************ 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # bdev_crypto_enomem 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=440889 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 440889 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # '[' -z 440889 ']' 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:37.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:37.657 23:48:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:37.657 [2024-07-24 23:48:22.453596] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:26:37.657 [2024-07-24 23:48:22.453632] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid440889 ] 00:26:37.657 [2024-07-24 23:48:22.517091] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:37.657 [2024-07-24 23:48:22.595697] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@864 -- # return 0 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:38.591 true 00:26:38.591 base0 00:26:38.591 true 00:26:38.591 [2024-07-24 23:48:23.279913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:26:38.591 crypt0 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_name=crypt0 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # local i 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:38.591 [ 00:26:38.591 { 00:26:38.591 "name": "crypt0", 00:26:38.591 "aliases": [ 00:26:38.591 "3f2c278c-0bbb-594d-b109-4e174b67da0b" 00:26:38.591 ], 00:26:38.591 "product_name": "crypto", 00:26:38.591 "block_size": 512, 00:26:38.591 "num_blocks": 2097152, 00:26:38.591 "uuid": "3f2c278c-0bbb-594d-b109-4e174b67da0b", 00:26:38.591 "assigned_rate_limits": { 00:26:38.591 "rw_ios_per_sec": 0, 00:26:38.591 "rw_mbytes_per_sec": 0, 00:26:38.591 "r_mbytes_per_sec": 0, 00:26:38.591 "w_mbytes_per_sec": 0 00:26:38.591 }, 00:26:38.591 "claimed": false, 00:26:38.591 "zoned": false, 00:26:38.591 "supported_io_types": { 00:26:38.591 "read": true, 00:26:38.591 "write": true, 00:26:38.591 "unmap": false, 00:26:38.591 "flush": false, 00:26:38.591 "reset": true, 00:26:38.591 "nvme_admin": false, 
00:26:38.591 "nvme_io": false, 00:26:38.591 "nvme_io_md": false, 00:26:38.591 "write_zeroes": true, 00:26:38.591 "zcopy": false, 00:26:38.591 "get_zone_info": false, 00:26:38.591 "zone_management": false, 00:26:38.591 "zone_append": false, 00:26:38.591 "compare": false, 00:26:38.591 "compare_and_write": false, 00:26:38.591 "abort": false, 00:26:38.591 "seek_hole": false, 00:26:38.591 "seek_data": false, 00:26:38.591 "copy": false, 00:26:38.591 "nvme_iov_md": false 00:26:38.591 }, 00:26:38.591 "memory_domains": [ 00:26:38.591 { 00:26:38.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:38.591 "dma_device_type": 2 00:26:38.591 } 00:26:38.591 ], 00:26:38.591 "driver_specific": { 00:26:38.591 "crypto": { 00:26:38.591 "base_bdev_name": "EE_base0", 00:26:38.591 "name": "crypt0", 00:26:38.591 "key_name": "test_dek_sw" 00:26:38.591 } 00:26:38.591 } 00:26:38.591 } 00:26:38.591 ] 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@907 -- # return 0 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=441044 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:38.591 23:48:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:26:38.591 Running I/O for 5 seconds... 
00:26:39.524 23:48:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:26:39.524 23:48:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:39.524 23:48:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:39.524 23:48:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:39.524 23:48:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 441044 00:26:43.709 00:26:43.709 Latency(us) 00:26:43.709 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:43.709 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:26:43.709 crypt0 : 5.00 55935.25 218.50 0.00 0.00 569.58 269.17 811.40 00:26:43.709 =================================================================================================================== 00:26:43.709 Total : 55935.25 218.50 0.00 0.00 569.58 269.17 811.40 00:26:43.709 0 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 440889 00:26:43.709 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # '[' -z 440889 ']' 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # kill -0 440889 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # uname 00:26:43.710 23:48:28 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 440889 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 440889' 00:26:43.710 killing process with pid 440889 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@969 -- # kill 440889 00:26:43.710 Received shutdown signal, test time was about 5.000000 seconds 00:26:43.710 00:26:43.710 Latency(us) 00:26:43.710 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:43.710 =================================================================================================================== 00:26:43.710 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@974 -- # wait 440889 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:26:43.710 00:26:43.710 real 0m6.227s 00:26:43.710 user 0m6.429s 00:26:43.710 sys 0m0.257s 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:43.710 23:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:26:43.710 ************************************ 00:26:43.710 END TEST bdev_crypto_enomem 00:26:43.710 ************************************ 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 
00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:26:43.710 23:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:26:43.710 00:26:43.710 real 0m50.447s 00:26:43.710 user 1m35.483s 00:26:43.710 sys 0m4.237s 00:26:43.710 23:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:43.710 23:48:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:26:43.710 ************************************ 00:26:43.710 END TEST blockdev_crypto_sw 00:26:43.710 ************************************ 00:26:43.710 23:48:28 -- spdk/autotest.sh@363 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:26:43.710 23:48:28 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:26:43.710 23:48:28 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:43.710 23:48:28 -- common/autotest_common.sh@10 -- # set +x 00:26:43.969 ************************************ 00:26:43.969 START TEST blockdev_crypto_qat 00:26:43.969 ************************************ 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:26:43.969 * Looking for test storage... 
00:26:43.969 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # 
env_ctx= 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=441944 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:43.969 23:48:28 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 441944 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # '[' -z 441944 ']' 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:43.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:43.969 23:48:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:43.969 [2024-07-24 23:48:28.881508] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:26:43.969 [2024-07-24 23:48:28.881557] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid441944 ] 00:26:43.969 [2024-07-24 23:48:28.945012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:44.228 [2024-07-24 23:48:29.023148] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:44.794 23:48:29 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:44.794 23:48:29 blockdev_crypto_qat -- common/autotest_common.sh@864 -- # return 0 00:26:44.794 23:48:29 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:26:44.794 23:48:29 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:26:44.794 23:48:29 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:26:44.794 23:48:29 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:44.794 23:48:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:44.794 [2024-07-24 23:48:29.685107] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:26:44.794 [2024-07-24 23:48:29.693136] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:44.795 [2024-07-24 23:48:29.701154] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:44.795 [2024-07-24 23:48:29.758566] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:26:47.337 true 00:26:47.337 true 00:26:47.337 true 00:26:47.337 true 00:26:47.337 Malloc0 00:26:47.337 Malloc1 00:26:47.337 Malloc2 00:26:47.337 Malloc3 00:26:47.337 [2024-07-24 23:48:32.040447] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:26:47.337 crypto_ram 00:26:47.337 [2024-07-24 23:48:32.048464] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:26:47.337 crypto_ram1 00:26:47.337 [2024-07-24 23:48:32.056488] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:26:47.337 crypto_ram2 00:26:47.337 [2024-07-24 23:48:32.064507] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:26:47.337 crypto_ram3 00:26:47.337 [ 00:26:47.337 { 00:26:47.337 "name": "Malloc1", 00:26:47.337 "aliases": [ 00:26:47.337 "6dcfaaf7-0cef-4b05-883a-071f9f81bc23" 00:26:47.337 ], 00:26:47.337 "product_name": "Malloc disk", 00:26:47.337 "block_size": 512, 00:26:47.337 "num_blocks": 65536, 00:26:47.337 "uuid": "6dcfaaf7-0cef-4b05-883a-071f9f81bc23", 00:26:47.337 "assigned_rate_limits": { 00:26:47.337 "rw_ios_per_sec": 0, 00:26:47.337 "rw_mbytes_per_sec": 0, 00:26:47.337 "r_mbytes_per_sec": 0, 00:26:47.337 "w_mbytes_per_sec": 0 00:26:47.337 }, 00:26:47.337 "claimed": true, 00:26:47.337 "claim_type": "exclusive_write", 00:26:47.337 "zoned": false, 00:26:47.337 "supported_io_types": { 00:26:47.337 "read": true, 00:26:47.337 "write": true, 00:26:47.337 "unmap": true, 00:26:47.337 "flush": true, 00:26:47.337 "reset": true, 00:26:47.337 "nvme_admin": false, 00:26:47.337 "nvme_io": false, 00:26:47.337 "nvme_io_md": false, 00:26:47.337 "write_zeroes": true, 00:26:47.337 "zcopy": true, 00:26:47.337 "get_zone_info": false, 00:26:47.337 "zone_management": false, 00:26:47.337 "zone_append": false, 00:26:47.337 "compare": false, 00:26:47.337 "compare_and_write": false, 00:26:47.337 "abort": true, 00:26:47.337 "seek_hole": false, 00:26:47.337 "seek_data": false, 00:26:47.337 "copy": true, 00:26:47.337 "nvme_iov_md": false 00:26:47.337 }, 00:26:47.337 "memory_domains": [ 00:26:47.337 { 00:26:47.337 "dma_device_id": "system", 00:26:47.337 "dma_device_type": 1 00:26:47.337 }, 00:26:47.337 { 00:26:47.337 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:26:47.337 "dma_device_type": 2 00:26:47.337 } 00:26:47.337 ], 00:26:47.337 "driver_specific": {} 00:26:47.337 } 00:26:47.337 ] 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.337 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.337 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:26:47.337 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.337 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.337 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.337 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:26:47.338 23:48:32 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "2a1deae4-ea5d-5c02-95c4-18b81905cef9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2a1deae4-ea5d-5c02-95c4-18b81905cef9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "aa6c1042-5504-580d-9044-394effe1b844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa6c1042-5504-580d-9044-394effe1b844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' 
'{' ' "name": "crypto_ram3",' ' "aliases": [' ' "85342774-5078-5087-b63f-084ee7749123"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "85342774-5078-5087-b63f-084ee7749123",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:26:47.338 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 441944 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # '[' -z 441944 ']' 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # kill -0 441944 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # uname 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:47.338 
23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 441944 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 441944' 00:26:47.338 killing process with pid 441944 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@969 -- # kill 441944 00:26:47.338 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@974 -- # wait 441944 00:26:47.905 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:47.905 23:48:32 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:47.905 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:47.905 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:47.905 23:48:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:47.905 ************************************ 00:26:47.905 START TEST bdev_hello_world 00:26:47.905 ************************************ 00:26:47.905 23:48:32 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:47.905 [2024-07-24 23:48:32.792987] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:26:47.905 [2024-07-24 23:48:32.793027] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid442571 ] 00:26:47.905 [2024-07-24 23:48:32.856045] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:48.163 [2024-07-24 23:48:32.927146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:48.163 [2024-07-24 23:48:32.948048] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:26:48.163 [2024-07-24 23:48:32.956074] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:48.163 [2024-07-24 23:48:32.964092] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:48.163 [2024-07-24 23:48:33.058386] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:26:50.729 [2024-07-24 23:48:35.194585] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:26:50.729 [2024-07-24 23:48:35.194639] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:50.729 [2024-07-24 23:48:35.194662] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:50.729 [2024-07-24 23:48:35.202606] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:26:50.729 [2024-07-24 23:48:35.202617] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:50.729 [2024-07-24 23:48:35.202623] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:50.729 [2024-07-24 23:48:35.210624] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:26:50.729 [2024-07-24 23:48:35.210633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:50.729 [2024-07-24 23:48:35.210638] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:50.729 [2024-07-24 23:48:35.218644] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:26:50.729 [2024-07-24 23:48:35.218652] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:50.729 [2024-07-24 23:48:35.218657] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:50.729 [2024-07-24 23:48:35.285402] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:50.729 [2024-07-24 23:48:35.285437] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:50.729 [2024-07-24 23:48:35.285446] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:26:50.729 [2024-07-24 23:48:35.286314] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:50.729 [2024-07-24 23:48:35.286367] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:50.729 [2024-07-24 23:48:35.286377] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:50.729 [2024-07-24 23:48:35.286408] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:26:50.729 00:26:50.729 [2024-07-24 23:48:35.286418] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:50.729 00:26:50.729 real 0m2.826s 00:26:50.729 user 0m2.540s 00:26:50.729 sys 0m0.256s 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:50.729 ************************************ 00:26:50.729 END TEST bdev_hello_world 00:26:50.729 ************************************ 00:26:50.729 23:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:26:50.729 23:48:35 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:26:50.729 23:48:35 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:50.729 23:48:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:50.729 ************************************ 00:26:50.729 START TEST bdev_bounds 00:26:50.729 ************************************ 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=443036 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 443036' 00:26:50.729 Process bdevio pid: 443036 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 443036 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@831 -- # '[' -z 443036 ']' 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:50.729 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:50.729 23:48:35 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:50.729 [2024-07-24 23:48:35.680166] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:26:50.729 [2024-07-24 23:48:35.680202] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid443036 ] 00:26:50.988 [2024-07-24 23:48:35.745556] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:50.988 [2024-07-24 23:48:35.826460] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:26:50.988 [2024-07-24 23:48:35.826558] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:26:50.988 [2024-07-24 23:48:35.826561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:50.988 [2024-07-24 23:48:35.847495] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:26:50.988 [2024-07-24 23:48:35.855520] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:50.988 [2024-07-24 23:48:35.863539] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:50.988 [2024-07-24 23:48:35.960127] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:26:53.520 [2024-07-24 23:48:38.098979] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:26:53.520 [2024-07-24 23:48:38.099038] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:53.520 [2024-07-24 23:48:38.099046] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:53.520 [2024-07-24 23:48:38.106998] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:26:53.520 [2024-07-24 23:48:38.107010] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:53.520 [2024-07-24 23:48:38.107016] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:53.520 [2024-07-24 23:48:38.115021] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:26:53.520 [2024-07-24 23:48:38.115031] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:53.520 [2024-07-24 23:48:38.115038] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:53.520 [2024-07-24 23:48:38.123041] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:26:53.520 [2024-07-24 23:48:38.123055] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:53.520 [2024-07-24 23:48:38.123060] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:53.520 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:53.520 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@864 -- # 
return 0 00:26:53.520 23:48:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:53.520 I/O targets: 00:26:53.520 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:26:53.520 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:26:53.520 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:26:53.520 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:26:53.520 00:26:53.520 00:26:53.520 CUnit - A unit testing framework for C - Version 2.1-3 00:26:53.520 http://cunit.sourceforge.net/ 00:26:53.520 00:26:53.520 00:26:53.520 Suite: bdevio tests on: crypto_ram3 00:26:53.520 Test: blockdev write read block ...passed 00:26:53.520 Test: blockdev write zeroes read block ...passed 00:26:53.520 Test: blockdev write zeroes read no split ...passed 00:26:53.520 Test: blockdev write zeroes read split ...passed 00:26:53.520 Test: blockdev write zeroes read split partial ...passed 00:26:53.520 Test: blockdev reset ...passed 00:26:53.520 Test: blockdev write read 8 blocks ...passed 00:26:53.520 Test: blockdev write read size > 128k ...passed 00:26:53.520 Test: blockdev write read invalid size ...passed 00:26:53.520 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:53.520 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:53.520 Test: blockdev write read max offset ...passed 00:26:53.520 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:53.521 Test: blockdev writev readv 8 blocks ...passed 00:26:53.521 Test: blockdev writev readv 30 x 1block ...passed 00:26:53.521 Test: blockdev writev readv block ...passed 00:26:53.521 Test: blockdev writev readv size > 128k ...passed 00:26:53.521 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:53.521 Test: blockdev comparev and writev ...passed 00:26:53.521 Test: blockdev nvme passthru rw ...passed 00:26:53.521 Test: blockdev nvme passthru vendor 
specific ...passed 00:26:53.521 Test: blockdev nvme admin passthru ...passed 00:26:53.521 Test: blockdev copy ...passed 00:26:53.521 Suite: bdevio tests on: crypto_ram2 00:26:53.521 Test: blockdev write read block ...passed 00:26:53.521 Test: blockdev write zeroes read block ...passed 00:26:53.521 Test: blockdev write zeroes read no split ...passed 00:26:53.521 Test: blockdev write zeroes read split ...passed 00:26:53.521 Test: blockdev write zeroes read split partial ...passed 00:26:53.521 Test: blockdev reset ...passed 00:26:53.521 Test: blockdev write read 8 blocks ...passed 00:26:53.521 Test: blockdev write read size > 128k ...passed 00:26:53.521 Test: blockdev write read invalid size ...passed 00:26:53.521 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:53.521 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:53.521 Test: blockdev write read max offset ...passed 00:26:53.521 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:53.521 Test: blockdev writev readv 8 blocks ...passed 00:26:53.521 Test: blockdev writev readv 30 x 1block ...passed 00:26:53.521 Test: blockdev writev readv block ...passed 00:26:53.521 Test: blockdev writev readv size > 128k ...passed 00:26:53.521 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:53.521 Test: blockdev comparev and writev ...passed 00:26:53.521 Test: blockdev nvme passthru rw ...passed 00:26:53.521 Test: blockdev nvme passthru vendor specific ...passed 00:26:53.521 Test: blockdev nvme admin passthru ...passed 00:26:53.521 Test: blockdev copy ...passed 00:26:53.521 Suite: bdevio tests on: crypto_ram1 00:26:53.521 Test: blockdev write read block ...passed 00:26:53.521 Test: blockdev write zeroes read block ...passed 00:26:53.521 Test: blockdev write zeroes read no split ...passed 00:26:53.521 Test: blockdev write zeroes read split ...passed 00:26:53.521 Test: blockdev write zeroes read split partial ...passed 00:26:53.521 
Test: blockdev reset ...passed 00:26:53.521 Test: blockdev write read 8 blocks ...passed 00:26:53.521 Test: blockdev write read size > 128k ...passed 00:26:53.521 Test: blockdev write read invalid size ...passed 00:26:53.521 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:53.521 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:53.521 Test: blockdev write read max offset ...passed 00:26:53.521 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:53.521 Test: blockdev writev readv 8 blocks ...passed 00:26:53.521 Test: blockdev writev readv 30 x 1block ...passed 00:26:53.521 Test: blockdev writev readv block ...passed 00:26:53.521 Test: blockdev writev readv size > 128k ...passed 00:26:53.521 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:53.521 Test: blockdev comparev and writev ...passed 00:26:53.521 Test: blockdev nvme passthru rw ...passed 00:26:53.521 Test: blockdev nvme passthru vendor specific ...passed 00:26:53.521 Test: blockdev nvme admin passthru ...passed 00:26:53.521 Test: blockdev copy ...passed 00:26:53.521 Suite: bdevio tests on: crypto_ram 00:26:53.521 Test: blockdev write read block ...passed 00:26:53.521 Test: blockdev write zeroes read block ...passed 00:26:53.521 Test: blockdev write zeroes read no split ...passed 00:26:53.521 Test: blockdev write zeroes read split ...passed 00:26:53.779 Test: blockdev write zeroes read split partial ...passed 00:26:53.779 Test: blockdev reset ...passed 00:26:53.779 Test: blockdev write read 8 blocks ...passed 00:26:53.779 Test: blockdev write read size > 128k ...passed 00:26:53.779 Test: blockdev write read invalid size ...passed 00:26:53.779 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:53.779 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:53.779 Test: blockdev write read max offset ...passed 00:26:53.779 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:26:53.779 Test: blockdev writev readv 8 blocks ...passed 00:26:53.779 Test: blockdev writev readv 30 x 1block ...passed 00:26:53.779 Test: blockdev writev readv block ...passed 00:26:53.779 Test: blockdev writev readv size > 128k ...passed 00:26:53.779 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:53.779 Test: blockdev comparev and writev ...passed 00:26:53.779 Test: blockdev nvme passthru rw ...passed 00:26:53.779 Test: blockdev nvme passthru vendor specific ...passed 00:26:53.779 Test: blockdev nvme admin passthru ...passed 00:26:53.779 Test: blockdev copy ...passed 00:26:53.779 00:26:53.779 Run Summary: Type Total Ran Passed Failed Inactive 00:26:53.780 suites 4 4 n/a 0 0 00:26:53.780 tests 92 92 92 0 0 00:26:53.780 asserts 520 520 520 0 n/a 00:26:53.780 00:26:53.780 Elapsed time = 0.508 seconds 00:26:53.780 0 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 443036 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 443036 ']' 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 443036 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 443036 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 443036' 00:26:53.780 killing process with pid 443036 00:26:53.780 23:48:38 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@969 -- # kill 443036 00:26:53.780 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@974 -- # wait 443036 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:26:54.038 00:26:54.038 real 0m3.277s 00:26:54.038 user 0m9.253s 00:26:54.038 sys 0m0.393s 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:54.038 ************************************ 00:26:54.038 END TEST bdev_bounds 00:26:54.038 ************************************ 00:26:54.038 23:48:38 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:26:54.038 23:48:38 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:26:54.038 23:48:38 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:54.038 23:48:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:26:54.038 ************************************ 00:26:54.038 START TEST bdev_nbd 00:26:54.038 ************************************ 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=443739 00:26:54.038 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:54.039 23:48:38 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 443739 /var/tmp/spdk-nbd.sock 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 443739 ']' 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:26:54.039 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:54.039 23:48:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:54.039 [2024-07-24 23:48:39.032978] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:26:54.039 [2024-07-24 23:48:39.033018] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:54.298 [2024-07-24 23:48:39.096930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:54.298 [2024-07-24 23:48:39.174184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:54.298 [2024-07-24 23:48:39.195038] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:26:54.298 [2024-07-24 23:48:39.203062] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:54.298 [2024-07-24 23:48:39.211082] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:54.557 [2024-07-24 23:48:39.306418] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:26:56.459 [2024-07-24 23:48:41.437623] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:26:56.459 [2024-07-24 23:48:41.437670] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:56.459 [2024-07-24 23:48:41.437682] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.459 [2024-07-24 23:48:41.445640] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:26:56.459 [2024-07-24 23:48:41.445653] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:56.459 [2024-07-24 23:48:41.445660] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.459 [2024-07-24 23:48:41.453665] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 
00:26:56.459 [2024-07-24 23:48:41.453677] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:56.459 [2024-07-24 23:48:41.453684] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.718 [2024-07-24 23:48:41.461684] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:26:56.718 [2024-07-24 23:48:41.461696] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:56.718 [2024-07-24 23:48:41.461703] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:26:56.718 23:48:41 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:56.718 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:56.977 1+0 records in 00:26:56.977 1+0 records out 00:26:56.977 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247709 
s, 16.5 MB/s 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:56.977 23:48:41 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:56.977 1+0 records in 00:26:56.977 1+0 records out 00:26:56.977 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000147747 s, 27.7 MB/s 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:56.977 23:48:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:57.237 1+0 records in 00:26:57.237 1+0 records out 00:26:57.237 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242459 s, 16.9 MB/s 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:57.237 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_start_disk crypto_ram3 00:26:57.495 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:57.496 1+0 records in 00:26:57.496 1+0 records out 00:26:57.496 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000140248 s, 29.2 MB/s 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:57.496 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:57.754 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd0", 00:26:57.754 "bdev_name": "crypto_ram" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd1", 00:26:57.754 "bdev_name": "crypto_ram1" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd2", 00:26:57.754 "bdev_name": "crypto_ram2" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd3", 00:26:57.754 "bdev_name": "crypto_ram3" 00:26:57.754 } 00:26:57.754 ]' 00:26:57.754 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:26:57.754 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd0", 00:26:57.754 "bdev_name": "crypto_ram" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd1", 00:26:57.754 "bdev_name": "crypto_ram1" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd2", 00:26:57.754 "bdev_name": "crypto_ram2" 00:26:57.754 }, 00:26:57.754 { 00:26:57.754 "nbd_device": "/dev/nbd3", 00:26:57.755 "bdev_name": "crypto_ram3" 00:26:57.755 } 00:26:57.755 ]' 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:57.755 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:58.013 23:48:42 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:58.271 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 
00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:58.528 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:26:58.787 /dev/nbd0 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:58.787 1+0 records in 00:26:58.787 1+0 records out 00:26:58.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239604 s, 17.1 MB/s 00:26:58.787 23:48:43 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:58.787 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:26:59.046 /dev/nbd1 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.046 23:48:43 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.046 1+0 records in 00:26:59.046 1+0 records out 00:26:59.046 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021532 s, 19.0 MB/s 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:59.046 23:48:43 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:26:59.305 /dev/nbd10 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.305 23:48:44 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.305 1+0 records in 00:26:59.305 1+0 records out 00:26:59.305 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228386 s, 17.9 MB/s 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:26:59.305 /dev/nbd11 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 
00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.305 1+0 records in 00:26:59.305 1+0 records out 00:26:59.305 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265466 s, 15.4 MB/s 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 
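The waitfornbd calls traced above all follow the same retry shape: poll /proc/partitions up to 20 times for the nbd name, break as soon as it appears, then issue a one-block O_DIRECT dd read to confirm the device answers. A minimal standalone sketch of the polling half (a temp file stands in for /proc/partitions; the function name and nbd0 entry are illustrative, not the autotest helper itself):

```shell
#!/bin/sh
# waitfornbd-style retry loop: poll a partitions table for a
# device name, giving up after 20 attempts. The temp file and
# its nbd0 row stand in for the real /proc/partitions.
waitfor_sketch() {
    name=$1 table=$2
    i=1
    while [ "$i" -le 20 ]; do
        grep -q -w "$name" "$table" && return 0   # device showed up
        i=$((i + 1))
        sleep 0.1                                 # brief backoff, then retry
    done
    return 1                                      # never appeared
}
table=$(mktemp)
printf 'major minor  #blocks  name\n  43    0   1048576  nbd0\n' > "$table"
waitfor_sketch nbd0 "$table" && echo found
```

The `-w` flag matters here: it keeps `nbd1` from matching the `nbd10`/`nbd11` rows, which is why the real helper greps the bare name word-wise.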
00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:59.305 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:59.563 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:59.563 { 00:26:59.563 "nbd_device": "/dev/nbd0", 00:26:59.563 "bdev_name": "crypto_ram" 00:26:59.563 }, 00:26:59.563 { 00:26:59.563 "nbd_device": "/dev/nbd1", 00:26:59.563 "bdev_name": "crypto_ram1" 00:26:59.563 }, 00:26:59.563 { 00:26:59.563 "nbd_device": "/dev/nbd10", 00:26:59.563 "bdev_name": "crypto_ram2" 00:26:59.563 }, 00:26:59.563 { 00:26:59.563 "nbd_device": "/dev/nbd11", 00:26:59.563 "bdev_name": "crypto_ram3" 00:26:59.563 } 00:26:59.563 ]' 00:26:59.563 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:26:59.563 { 00:26:59.563 "nbd_device": "/dev/nbd0", 00:26:59.564 "bdev_name": "crypto_ram" 00:26:59.564 }, 00:26:59.564 { 00:26:59.564 "nbd_device": "/dev/nbd1", 00:26:59.564 "bdev_name": "crypto_ram1" 00:26:59.564 }, 00:26:59.564 { 00:26:59.564 "nbd_device": "/dev/nbd10", 00:26:59.564 "bdev_name": "crypto_ram2" 00:26:59.564 }, 00:26:59.564 { 00:26:59.564 "nbd_device": "/dev/nbd11", 00:26:59.564 "bdev_name": "crypto_ram3" 00:26:59.564 } 00:26:59.564 ]' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:59.564 /dev/nbd1 00:26:59.564 /dev/nbd10 00:26:59.564 /dev/nbd11' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo 
'/dev/nbd0 00:26:59.564 /dev/nbd1 00:26:59.564 /dev/nbd10 00:26:59.564 /dev/nbd11' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:59.564 256+0 records in 00:26:59.564 256+0 records out 00:26:59.564 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103151 s, 102 MB/s 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:59.564 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:59.822 256+0 records in 00:26:59.822 256+0 records 
out 00:26:59.822 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.034912 s, 30.0 MB/s 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:26:59.822 256+0 records in 00:26:59.822 256+0 records out 00:26:59.822 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0295752 s, 35.5 MB/s 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:26:59.822 256+0 records in 00:26:59.822 256+0 records out 00:26:59.822 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0246178 s, 42.6 MB/s 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:26:59.822 256+0 records in 00:26:59.822 256+0 records out 00:26:59.822 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0229036 s, 45.8 MB/s 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:59.822 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
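The nbd_dd_data_verify sequence above boils down to: fill a scratch file with 1 MiB of random data, dd it onto every device in the list, then cmp each device's first 1 MiB back against the scratch file. A self-contained sketch of that pattern, using plain files in place of the /dev/nbd* devices (an assumption so it runs without an nbd module loaded):

```shell
#!/bin/sh
# write-then-verify pattern from nbd_common.sh, on plain files.
set -e
tmp=$(mktemp -d)
rand="$tmp/nbdrandtest"
# same geometry as the log: bs=4096 count=256 -> 1 MiB of random data
dd if=/dev/urandom of="$rand" bs=4096 count=256 2>/dev/null
for dev in "$tmp/nbd0" "$tmp/nbd1" "$tmp/nbd10" "$tmp/nbd11"; do
    dd if="$rand" of="$dev" bs=4096 count=256 2>/dev/null   # write phase
done
for dev in "$tmp/nbd0" "$tmp/nbd1" "$tmp/nbd10" "$tmp/nbd11"; do
    cmp -n 1048576 "$rand" "$dev"                           # verify phase
done
echo verified
```

The real run adds `oflag=direct` on the writes so the data actually reaches the nbd device rather than the page cache; that flag is dropped here because it needs a block device.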
00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:59.823 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.081 23:48:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:00.081 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:00.081 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.082 23:48:45 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.082 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.339 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.340 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:27:00.597 23:48:45 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:00.597 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- 
# '[' 0 -ne 0 ']' 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:00.855 malloc_lvol_verify 00:27:00.855 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:01.113 fa4b3fa2-34f5-4cec-9be6-0f9784a0b097 00:27:01.113 23:48:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:27:01.113 e4af5fe9-def0-4ecf-9808-3c211664cb7b 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:01.371 /dev/nbd0 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:27:01.371 mke2fs 1.46.5 (30-Dec-2021) 00:27:01.371 Discarding device blocks: 0/4096 done 00:27:01.371 Creating filesystem with 4096 1k blocks and 1024 inodes 
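A few lines up, nbd_get_count pipes the nbd_get_disks JSON through `jq -r '.[] | .nbd_device'` and counts the resulting names with `grep -c /dev/nbd`. With an empty disk list, grep -c still prints 0 but exits 1, which is why the trace shows a bare `true` after the empty-list count. A dependency-free sketch of just that counting step (the name lists are illustrative stand-ins for jq output):

```shell
#!/bin/sh
# Count nbd device names the way nbd_common.sh does.
# grep -c prints 0 but EXITS 1 when nothing matches, so the
# trailing `|| true` keeps the step usable under `set -e`.
set -e
names_full='/dev/nbd0
/dev/nbd1
/dev/nbd10
/dev/nbd11'
names_empty=''
count_full=$(printf '%s\n' "$names_full" | grep -c /dev/nbd || true)
count_empty=$(printf '%s' "$names_empty" | grep -c /dev/nbd || true)
echo "attached=$count_full after_stop=$count_empty"
```

This matches the two states visible in the log: count=4 while the disks are attached, count=0 after nbd_stop_disks.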
00:27:01.371 00:27:01.371 Allocating group tables: 0/1 done 00:27:01.371 Writing inode tables: 0/1 done 00:27:01.371 Creating journal (1024 blocks): done 00:27:01.371 Writing superblocks and filesystem accounting information: 0/1 done 00:27:01.371 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:01.371 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:01.630 23:48:46 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 443739 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 443739 ']' 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 443739 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 443739 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 443739' 00:27:01.630 killing process with pid 443739 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@969 -- # kill 443739 00:27:01.630 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@974 -- # wait 443739 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:27:01.889 00:27:01.889 real 0m7.846s 00:27:01.889 user 0m10.500s 00:27:01.889 sys 0m2.366s 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:01.889 ************************************ 00:27:01.889 END TEST bdev_nbd 00:27:01.889 ************************************ 00:27:01.889 23:48:46 blockdev_crypto_qat -- 
bdev/blockdev.sh@762 -- # [[ y == y ]] 00:27:01.889 23:48:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:27:01.889 23:48:46 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:27:01.889 23:48:46 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:27:01.889 23:48:46 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:27:01.889 23:48:46 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:01.889 23:48:46 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:01.889 ************************************ 00:27:01.889 START TEST bdev_fio 00:27:01.889 ************************************ 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:01.889 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:27:01.889 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
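The killprocess 443739 sequence that closes bdev_nbd above guards the kill three ways: reject an empty pid, probe liveness with `kill -0`, and read the command name via `ps --no-headers -o comm=` so a `sudo` wrapper is never signalled directly. A sketch of that guard (the function and the throwaway child are examples, not the autotest helper itself):

```shell
#!/bin/sh
# killprocess-style guard: only signal a pid that is non-empty,
# alive, and not a sudo wrapper (mirrors the checks traced above).
killprocess_sketch() {
    pid=$1
    [ -n "$pid" ] || return 1                    # "'[' -z ... ']'" guard
    kill -0 "$pid" 2>/dev/null || return 0       # already gone: nothing to do
    comm=$(ps --no-headers -o comm= -p "$pid")
    [ "$comm" = sudo ] && return 1               # never kill sudo itself
    echo "killing process with pid $pid"
    kill "$pid"
}
sleep 30 &                  # disposable child standing in for the target
child=$!
killprocess_sketch "$child"
wait "$child" 2>/dev/null || true
echo reaped
```

The subsequent `wait 443739` in the log serves the same purpose as the `wait` here: reaping the process so the test does not leave a zombie behind.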
00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:27:02.148 23:48:46 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:02.148 ************************************ 00:27:02.148 START TEST bdev_fio_rw_verify 00:27:02.148 ************************************ 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1343 -- # local asan_lib= 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:27:02.148 23:48:46 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:02.148 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:02.149 23:48:47 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # 
/usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:02.407 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:02.407 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:02.407 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:02.407 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:02.407 fio-3.35 00:27:02.407 Starting 4 threads 00:27:17.281 00:27:17.281 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=445659: Wed Jul 24 23:48:59 2024 00:27:17.281 read: IOPS=26.7k, BW=104MiB/s (109MB/s)(1043MiB/10001msec) 00:27:17.281 slat (usec): min=11, max=294, avg=53.55, stdev=33.34 00:27:17.281 clat (usec): min=15, max=2578, avg=297.23, stdev=196.49 00:27:17.281 lat (usec): min=51, max=2616, avg=350.77, stdev=213.29 00:27:17.281 clat percentiles (usec): 00:27:17.281 | 50.000th=[ 241], 99.000th=[ 930], 99.900th=[ 1106], 99.990th=[ 1319], 00:27:17.281 | 99.999th=[ 2008] 00:27:17.281 write: IOPS=29.3k, BW=114MiB/s (120MB/s)(1114MiB/9741msec); 0 zone resets 00:27:17.281 slat (usec): min=17, max=1186, avg=60.29, stdev=29.41 00:27:17.281 clat (usec): min=17, max=1985, avg=321.26, stdev=190.42 00:27:17.281 lat (usec): min=46, max=2034, avg=381.55, stdev=203.82 00:27:17.281 clat percentiles (usec): 00:27:17.281 | 50.000th=[ 281], 99.000th=[ 914], 99.900th=[ 1074], 99.990th=[ 1172], 00:27:17.281 | 99.999th=[ 1745] 00:27:17.281 bw ( KiB/s): min=99720, max=163949, per=97.71%, avg=114387.21, stdev=3812.05, 
samples=76 00:27:17.281 iops : min=24930, max=40987, avg=28596.79, stdev=953.00, samples=76 00:27:17.281 lat (usec) : 20=0.01%, 50=0.06%, 100=7.52%, 250=40.19%, 500=36.80% 00:27:17.281 lat (usec) : 750=11.47%, 1000=3.56% 00:27:17.281 lat (msec) : 2=0.40%, 4=0.01% 00:27:17.281 cpu : usr=99.69%, sys=0.00%, ctx=75, majf=0, minf=305 00:27:17.281 IO depths : 1=3.3%, 2=27.7%, 4=55.3%, 8=13.8%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:17.281 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:17.281 complete : 0=0.0%, 4=87.9%, 8=12.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:17.281 issued rwts: total=267008,285094,0,0 short=0,0,0,0 dropped=0,0,0,0 00:27:17.281 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:17.281 00:27:17.281 Run status group 0 (all jobs): 00:27:17.281 READ: bw=104MiB/s (109MB/s), 104MiB/s-104MiB/s (109MB/s-109MB/s), io=1043MiB (1094MB), run=10001-10001msec 00:27:17.281 WRITE: bw=114MiB/s (120MB/s), 114MiB/s-114MiB/s (120MB/s-120MB/s), io=1114MiB (1168MB), run=9741-9741msec 00:27:17.281 00:27:17.281 real 0m13.226s 00:27:17.281 user 0m48.214s 00:27:17.281 sys 0m0.380s 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:17.281 ************************************ 00:27:17.281 END TEST bdev_fio_rw_verify 00:27:17.281 ************************************ 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:27:17.281 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "2a1deae4-ea5d-5c02-95c4-18b81905cef9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2a1deae4-ea5d-5c02-95c4-18b81905cef9",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": 
"test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "aa6c1042-5504-580d-9044-394effe1b844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa6c1042-5504-580d-9044-394effe1b844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "85342774-5078-5087-b63f-084ee7749123"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "85342774-5078-5087-b63f-084ee7749123",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": 
false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:27:17.282 crypto_ram1 00:27:17.282 crypto_ram2 00:27:17.282 crypto_ram3 ]] 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "2a1deae4-ea5d-5c02-95c4-18b81905cef9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2a1deae4-ea5d-5c02-95c4-18b81905cef9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "ce7fdd5b-48b3-5813-8e89-4a5c110f6eaf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "aa6c1042-5504-580d-9044-394effe1b844"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "aa6c1042-5504-580d-9044-394effe1b844",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "85342774-5078-5087-b63f-084ee7749123"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "85342774-5078-5087-b63f-084ee7749123",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 
00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:17.282 ************************************ 00:27:17.282 START TEST bdev_fio_trim 00:27:17.282 ************************************ 00:27:17.282 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:17.283 23:49:00 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:17.283 23:49:00 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:17.283 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:17.283 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:17.283 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:17.283 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:17.283 fio-3.35 00:27:17.283 Starting 4 threads 00:27:29.529 00:27:29.529 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=448038: Wed Jul 24 23:49:13 2024 00:27:29.529 write: IOPS=42.8k, BW=167MiB/s (175MB/s)(1670MiB/10001msec); 0 zone resets 00:27:29.529 slat (usec): min=12, max=322, avg=53.57, stdev=20.05 00:27:29.529 clat (usec): min=30, max=1286, avg=192.15, stdev=89.70 00:27:29.529 lat (usec): min=45, max=1336, avg=245.72, stdev=96.16 00:27:29.529 clat percentiles (usec): 00:27:29.529 | 50.000th=[ 188], 99.000th=[ 404], 99.900th=[ 490], 99.990th=[ 644], 00:27:29.529 | 99.999th=[ 1074] 00:27:29.529 bw ( KiB/s): min=163424, max=243159, per=100.00%, avg=171152.37, stdev=5205.58, samples=76 00:27:29.529 iops : min=40856, max=60789, avg=42788.05, stdev=1301.34, samples=76 00:27:29.529 trim: IOPS=42.8k, BW=167MiB/s (175MB/s)(1670MiB/10001msec); 0 zone resets 00:27:29.529 slat (nsec): min=4261, max=96237, avg=17015.84, stdev=6812.54 00:27:29.529 clat (usec): min=28, max=1336, avg=245.87, stdev=96.18 00:27:29.529 lat (usec): min=33, max=1372, avg=262.88, stdev=97.63 00:27:29.529 clat percentiles (usec): 00:27:29.529 | 50.000th=[ 241], 99.000th=[ 474], 99.900th=[ 562], 99.990th=[ 783], 00:27:29.529 | 99.999th=[ 1270] 00:27:29.529 bw ( KiB/s): min=163424, max=243159, per=100.00%, avg=171151.95, stdev=5205.62, samples=76 00:27:29.529 iops : min=40856, max=60789, avg=42788.05, stdev=1301.34, samples=76 00:27:29.529 lat (usec) : 50=0.55%, 100=10.33%, 250=51.34%, 500=37.52%, 750=0.26% 00:27:29.529 lat (usec) : 1000=0.01% 00:27:29.529 lat (msec) : 2=0.01% 00:27:29.529 cpu : usr=99.69%, sys=0.00%, ctx=48, majf=0, minf=92 00:27:29.529 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:29.529 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 
00:27:29.529 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:29.529 issued rwts: total=0,427562,427563,0 short=0,0,0,0 dropped=0,0,0,0 00:27:29.529 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:29.529 00:27:29.529 Run status group 0 (all jobs): 00:27:29.529 WRITE: bw=167MiB/s (175MB/s), 167MiB/s-167MiB/s (175MB/s-175MB/s), io=1670MiB (1751MB), run=10001-10001msec 00:27:29.529 TRIM: bw=167MiB/s (175MB/s), 167MiB/s-167MiB/s (175MB/s-175MB/s), io=1670MiB (1751MB), run=10001-10001msec 00:27:29.529 00:27:29.529 real 0m13.222s 00:27:29.529 user 0m48.398s 00:27:29.529 sys 0m0.389s 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:29.529 ************************************ 00:27:29.529 END TEST bdev_fio_trim 00:27:29.529 ************************************ 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:27:29.529 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:27:29.529 00:27:29.529 real 0m26.739s 00:27:29.529 user 1m36.779s 00:27:29.529 sys 0m0.910s 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:29.529 ************************************ 00:27:29.529 END TEST bdev_fio 00:27:29.529 ************************************ 00:27:29.529 23:49:13 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup 
SIGINT SIGTERM EXIT 00:27:29.529 23:49:13 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:29.529 23:49:13 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:27:29.529 23:49:13 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:29.529 23:49:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:29.529 ************************************ 00:27:29.529 START TEST bdev_verify 00:27:29.529 ************************************ 00:27:29.529 23:49:13 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:29.529 [2024-07-24 23:49:13.715583] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:27:29.529 [2024-07-24 23:49:13.715617] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449656 ] 00:27:29.529 [2024-07-24 23:49:13.777520] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:29.529 [2024-07-24 23:49:13.855924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:29.529 [2024-07-24 23:49:13.855928] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:29.529 [2024-07-24 23:49:13.876909] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:29.529 [2024-07-24 23:49:13.884932] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:29.529 [2024-07-24 23:49:13.892951] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:29.529 [2024-07-24 23:49:13.988699] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:31.430 [2024-07-24 23:49:16.122341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:31.430 [2024-07-24 23:49:16.122395] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:31.430 [2024-07-24 23:49:16.122403] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:31.430 [2024-07-24 23:49:16.130351] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:31.430 [2024-07-24 23:49:16.130361] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:31.430 [2024-07-24 23:49:16.130367] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:31.430 
[2024-07-24 23:49:16.138372] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:27:31.430 [2024-07-24 23:49:16.138381] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:31.430 [2024-07-24 23:49:16.138386] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:31.430 [2024-07-24 23:49:16.146396] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:31.430 [2024-07-24 23:49:16.146405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:31.430 [2024-07-24 23:49:16.146410] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:31.430 Running I/O for 5 seconds... 00:27:36.699 00:27:36.699 Latency(us) 00:27:36.700 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:36.700 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x0 length 0x1000 00:27:36.700 crypto_ram : 5.04 686.38 2.68 0.00 0.00 186077.14 8613.30 120336.58 00:27:36.700 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x1000 length 0x1000 00:27:36.700 crypto_ram : 5.04 687.65 2.69 0.00 0.00 185596.87 392.05 119837.26 00:27:36.700 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x0 length 0x1000 00:27:36.700 crypto_ram1 : 5.04 687.78 2.69 0.00 0.00 185399.12 157.01 111848.11 00:27:36.700 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x1000 length 0x1000 00:27:36.700 crypto_ram1 : 5.04 692.13 2.70 0.00 0.00 184248.42 565.64 111848.11 00:27:36.700 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 
Verification LBA range: start 0x0 length 0x1000 00:27:36.700 crypto_ram2 : 5.02 5425.79 21.19 0.00 0.00 23446.92 5149.26 19848.05 00:27:36.700 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x1000 length 0x1000 00:27:36.700 crypto_ram2 : 5.03 5444.34 21.27 0.00 0.00 23367.48 4805.97 19848.05 00:27:36.700 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x0 length 0x1000 00:27:36.700 crypto_ram3 : 5.03 5442.70 21.26 0.00 0.00 23339.93 1404.34 19848.05 00:27:36.700 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:36.700 Verification LBA range: start 0x1000 length 0x1000 00:27:36.700 crypto_ram3 : 5.03 5453.47 21.30 0.00 0.00 23289.30 748.98 19848.05 00:27:36.700 =================================================================================================================== 00:27:36.700 Total : 24520.24 95.78 0.00 0.00 41571.85 157.01 120336.58 00:27:36.700 00:27:36.700 real 0m7.903s 00:27:36.700 user 0m15.235s 00:27:36.700 sys 0m0.239s 00:27:36.700 23:49:21 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:36.700 23:49:21 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:27:36.700 ************************************ 00:27:36.700 END TEST bdev_verify 00:27:36.700 ************************************ 00:27:36.700 23:49:21 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:36.700 23:49:21 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:27:36.700 23:49:21 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:36.700 23:49:21 blockdev_crypto_qat 
-- common/autotest_common.sh@10 -- # set +x
00:27:36.700 ************************************
00:27:36.700 START TEST bdev_verify_big_io
00:27:36.700 ************************************
00:27:36.700 23:49:21 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:27:36.700 [2024-07-24 23:49:21.677005] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization...
00:27:36.700 [2024-07-24 23:49:21.677037] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451021 ]
00:27:36.958 [2024-07-24 23:49:21.738693] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:27:36.958 [2024-07-24 23:49:21.810493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:27:36.958 [2024-07-24 23:49:21.810495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:27:36.958 [2024-07-24 23:49:21.831477] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:27:36.958 [2024-07-24 23:49:21.839502] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:27:36.958 [2024-07-24 23:49:21.847519] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:27:36.958 [2024-07-24 23:49:21.943028] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:27:39.490 [2024-07-24 23:49:24.082458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:27:39.490 [2024-07-24 23:49:24.082518] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:27:39.490 [2024-07-24 23:49:24.082542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:27:39.490 [2024-07-24 23:49:24.090482] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:27:39.491 [2024-07-24 23:49:24.090494] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:27:39.491 [2024-07-24 23:49:24.090499] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:27:39.491 [2024-07-24 23:49:24.098506] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:27:39.491 [2024-07-24 23:49:24.098516] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:27:39.491 [2024-07-24 23:49:24.098522] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:27:39.491 [2024-07-24 23:49:24.106529] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:27:39.491 [2024-07-24 23:49:24.106541] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:27:39.491 [2024-07-24 23:49:24.106546] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:27:39.491 Running I/O for 5 seconds...
00:27:39.752 [2024-07-24 23:49:24.664762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.752 [2024-07-24 23:49:24.665053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.752 [2024-07-24 23:49:24.665104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.752 [2024-07-24 23:49:24.665134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.754 (last message repeated continuously through [2024-07-24 23:49:24.738439] while I/O was running)
00:27:39.754 [2024-07-24 23:49:24.738448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.740505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.740533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.740559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.740590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.740988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.741016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.741040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.741066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.741391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.741399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.743532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.743560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.754 [2024-07-24 23:49:24.743585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.743610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.743954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.743983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.744008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.744033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.744298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.744307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.754 [2024-07-24 23:49:24.746957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.754 [2024-07-24 23:49:24.746986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.747016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.747353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.747362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.755 [2024-07-24 23:49:24.749981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.750309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.750319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.752960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.753271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.753282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.754634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.754998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.755178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.755187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.756837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.756866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.756892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.756931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.757307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.757335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.757361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.757386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.757719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.757728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.758917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.758945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.758971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.758993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.759209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.759236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.759260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.759285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.759518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.759526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.761356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.761639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.762067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.762876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.764072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.765098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.765733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.766516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.766697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.766706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.768637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.768897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.769855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.770721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.771846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.772311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.773220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.774224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.774402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.774410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.776361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.776847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.777627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.017 [2024-07-24 23:49:24.778566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.779755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.780409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.781168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.782109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.782327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.017 [2024-07-24 23:49:24.782336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.784315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.784832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.785612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.786534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.787667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.788405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.018 [2024-07-24 23:49:24.789190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.790124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.790307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.790315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.792437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.793432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.794441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.795485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.796098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.796885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.797818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.798755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.798932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.018 [2024-07-24 23:49:24.798941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.801689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.802471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.803398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.804315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.805523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.806386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.807308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.808265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.808539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.808549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.811428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.812360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.018 [2024-07-24 23:49:24.813294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.814090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.815102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.816038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.816977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.817569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.817925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.817934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.820473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.821407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.822347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.822859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.824116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.018 [2024-07-24 23:49:24.825076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.826320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.826621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.826941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.826950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.828526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.829297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.830232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.831164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.831612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.831870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.832126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.018 [2024-07-24 23:49:24.832381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.018 [2024-07-24 23:49:24.832646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.021 [2024-07-24 23:49:25.001535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.001560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.001584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.001925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.001933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.003656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.003684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.003721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.021 [2024-07-24 23:49:25.003746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.003992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.004036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.004064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.004088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.022 [2024-07-24 23:49:25.004119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.004461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.004475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.006683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.007019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.022 [2024-07-24 23:49:25.007028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.008758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.008786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.008814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.008839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.009682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.022 [2024-07-24 23:49:25.011441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.011929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.012241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.022 [2024-07-24 23:49:25.012250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.284 [2024-07-24 23:49:25.013931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.284 [2024-07-24 23:49:25.013960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.284 [2024-07-24 23:49:25.013985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.284 [2024-07-24 23:49:25.014011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.014863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.016583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.016612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.016656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.016682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.016965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.016997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.017024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.017049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.017073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.017395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.017404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.019623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.019992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.021742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.021770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.021798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.021822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.022554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.022563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.024733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.025028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.025037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.026719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.026751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.026776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.026802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.027561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.029490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.029883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.030144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.030152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.032449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.032905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.034759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.034799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.034833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.034859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.035110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.285 [2024-07-24 23:49:25.035156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.285 [2024-07-24 23:49:25.035182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.035208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.035233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.035536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.035544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.037758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.286 [2024-07-24 23:49:25.037783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.038128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.038137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.039838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.039877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.039913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.039950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.286 [2024-07-24 23:49:25.040646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.286 [2024-07-24 23:49:25.040655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.289 [2024-07-24 23:49:25.149402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.149583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.149592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.151366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.152060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.152844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.153805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.153988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.154764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.155677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.156486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.157417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.157599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.289 [2024-07-24 23:49:25.157607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.159618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.160575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.161609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.162592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.162784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.163240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.164013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.164942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.165911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.166097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.166105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.168589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.289 [2024-07-24 23:49:25.169384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.170335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.171291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.171598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.172592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.173484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.174440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.175416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.175700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.175712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.178526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.179518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.180487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.289 [2024-07-24 23:49:25.181333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.181573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.182384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.183346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.184307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.184933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.185272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.185281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.187693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.188657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.189610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.190059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.190244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.289 [2024-07-24 23:49:25.191184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.192185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.289 [2024-07-24 23:49:25.193262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.193538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.193897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.193906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.196395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.197355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.198141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.199040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.199261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.200244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.201195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.201769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.202042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.202368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.202377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.204858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.205810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.206249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.207051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.207231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.208253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.209309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.209578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.209834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.210125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.210135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.212476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.213313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.214176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.214975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.215160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.216095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.216781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.217050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.217317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.217687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.217696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.219853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.220302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.221167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.222135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.222320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.223335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.223613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.223877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.224141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.224494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.224504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.226646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.227301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.228113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.229071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.229251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.230038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.230299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.230557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.230823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.231149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.231158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.232788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.233850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.234830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.235851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.236032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.236402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.236685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.236948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.237215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.237479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.237488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.239018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.239852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.240793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.241727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.241946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.242218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.242481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.242760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.243028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.243212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.243220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.245359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.246284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.247274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.248355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.248689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.248965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.249230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.249500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.290 [2024-07-24 23:49:25.250151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.250366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.250374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.252233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.290 [2024-07-24 23:49:25.253200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.254166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.254798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.255120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.255385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.255650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.255908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.256796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.256980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.291 [2024-07-24 23:49:25.256988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.259110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.259667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.259935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.260198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.260545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.260825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.261085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.261358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.261641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.261955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.261965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.291 [2024-07-24 23:49:25.263990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.291 [2024-07-24 23:49:25.264252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.556 [... previous message repeated, with per-attempt timestamps, through 2024-07-24 23:49:25.334814 ...]
00:27:40.556 [2024-07-24 23:49:25.334822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.336975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.556 [2024-07-24 23:49:25.337000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.337024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.337048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.337257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.337266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.338440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.338831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.339003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.339011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.340534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.340988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.341013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.341320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.341328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.342795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.342868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.343073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.343081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.344811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.344836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.345190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.345199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.346839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.347016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.347024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.348993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.350563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.350973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.351146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.351155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.557 [2024-07-24 23:49:25.352289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.352675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.557 [2024-07-24 23:49:25.353024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.353033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.354922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.354988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.355016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.355041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.355215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.355223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.356364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.356392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.357675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.357750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.358161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.358170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.359740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.359775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.359817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.360851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.361167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.361347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.363312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.363608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.363878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.364147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.364489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.365462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.366322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.367274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.368232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.368527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.368536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.369952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.370214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.370480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.370738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.371033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.371945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.372964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.373948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.374888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.375136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.375144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.376480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.376743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.377000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.377256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.377435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.378255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.379207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.380137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.380574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.380754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.380764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.382132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.382417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.558 [2024-07-24 23:49:25.382690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.558 [2024-07-24 23:49:25.383131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.825 [2024-07-24 23:49:25.557251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.557543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.557814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.558146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.558430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.558708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.558979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.559251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.559616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.559626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.561588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.561855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.562115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.562374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.562710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.562986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.563258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.563527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.563791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.564132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.564141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.566204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.566476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.566524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.566789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.567077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.567351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.567617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.567881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.568144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.568386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.568395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.570292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.570577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.570848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.570880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.571199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.571473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.571745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.572003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.572267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.572556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.572566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.574982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.575020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.575059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.575420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.575429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.577917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.579666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.579697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.579744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.579780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.580536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.825 [2024-07-24 23:49:25.582341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.582753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.583092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.825 [2024-07-24 23:49:25.583101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.584761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.584798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.584824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.584850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.585264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.585709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.587947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.587974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.588003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.588029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.588282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.588291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.590516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.590909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.592567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.592595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.592640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.592666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.592991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.593037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.593064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.593089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.593114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.593464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.593479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.595689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.596018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.596027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.597669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.597698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.597724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.597751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.598539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.600326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.600785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.601042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.601051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.602679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.602707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.602735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.602761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.826 [2024-07-24 23:49:25.603080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [2024-07-24 23:49:25.603111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.826 [... previous message repeated continuously from 23:49:25.603 through 23:49:25.683; duplicate log lines omitted ...] 
00:27:40.830 [2024-07-24 23:49:25.684861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.685116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.685930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.686891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.687789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.688665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.688909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.688918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.691396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.692368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.693330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.693961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.694146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.694951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.695911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.696869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.697292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.697652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.697667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.700093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.701055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.702023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.702507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.702698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.703748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.704764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.705749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.706015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.706331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.706340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.708714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.709702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.710318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.711423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.711624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.712603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.713550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.713938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.714206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.714541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.714549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.717032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.718117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.718738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.719537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.719720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.720686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.721517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.721786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.722045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.722319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.722328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.724549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.725140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.726194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.727143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.727322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.728273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.728641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.728899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.729159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.729515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.729524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.731713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.732306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.733090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.734013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.734192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.735061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.735322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.735583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.735842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.736165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.736174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.738060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.738967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.739774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.740701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.830 [2024-07-24 23:49:25.740880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.741425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.741689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.741950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.742216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.742543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.742552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.744032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.830 [2024-07-24 23:49:25.744815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.745746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.746681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.746862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.747130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.747386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.747650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.747910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.748124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.748133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.750098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.750876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.751811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.752734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.753105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.753390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.753653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.753917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.754283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.754460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.754471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.756352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.757289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.758214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.758967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.759231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.759508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.759774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.760040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.760918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.761149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.761157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.763190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.764157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.765191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.765457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.765779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.766046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.766305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.766825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.767609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.767789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.767796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.769828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.770762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.771347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.771619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.771953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.772220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.772481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.773446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.774320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.774502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.774510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.776640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.777699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.777964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.778223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.778490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.778756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.779282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.780064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.780997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.781178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.781186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.783413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.784374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.784648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.784920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.785198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.785464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.786120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.786899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.787831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.788010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.788018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.790095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.790548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.790812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.791071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.791433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.791701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.792614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.831 [2024-07-24 23:49:25.793592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.794538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.794722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.794730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.796697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.796968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.797229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.797492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.797826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.831 [2024-07-24 23:49:25.798388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.832 [2024-07-24 23:49:25.799326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.832 [2024-07-24 23:49:25.800269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.832 [2024-07-24 23:49:25.800732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.096 [2024-07-24 23:49:25.884074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.884101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.884126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.884151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.884488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.884497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.096 [2024-07-24 23:49:25.886228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.886686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.886712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.887025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.887034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.888942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.889118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.889127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.890748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.891091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.891099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.893789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.893829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.893853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.893878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.894353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.897217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.897931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.900788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.900911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.901084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.901092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.097 [2024-07-24 23:49:25.903444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.097 [2024-07-24 23:49:25.903472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.903496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.903669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.903677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.906817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.906842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.907096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.907104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.909884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.910062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.910070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.912916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.912950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.912976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.913501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.916401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.916433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.916740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.916770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.916795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.917093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.963485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.967619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.967667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.968613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.968658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.969127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.969166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.969924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.970102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.970110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.970117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.975510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.976524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.977438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.977618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.977627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.979417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.979857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.980664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.981597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.982786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.983432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.984208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.985142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.985320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.985328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.987209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.988223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.989224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.990272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.990890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.991697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.992630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.993559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.098 [2024-07-24 23:49:25.993736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.993747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.996207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.996994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.997932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:25.998855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:26.000084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.098 [2024-07-24 23:49:26.000933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.099 [2024-07-24 23:49:26.001900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.099 [2024-07-24 23:49:26.002870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.099 [2024-07-24 23:49:26.003153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.099 [2024-07-24 23:49:26.003163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.099 [2024-07-24 23:49:26.005695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.099 [2024-07-24 23:49:26.006624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:41.363 [... same *ERROR* message repeated for subsequent allocation attempts through 2024-07-24 23:49:26.151309]
00:27:41.363 [2024-07-24 23:49:26.151870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.363 [2024-07-24 23:49:26.151911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.151949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.152205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.152520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.152530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.154694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.154720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.155040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.155049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.156987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.157013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.157188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.157196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.158328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.158748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.159091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.159101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.160717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.160751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.160775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.160808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.161305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.162818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.162842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.163118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.163127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.165549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.165558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.166700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.166728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.166752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.166793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.167265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.168975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.169031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.169696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.170822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.170850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.170875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.170900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.171146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.364 [2024-07-24 23:49:26.171173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.171198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.364 [2024-07-24 23:49:26.171223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.171402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.171411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.172961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.173793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.173802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.174873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.174903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.174929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.174955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.175529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.176813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.176842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.176868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.177225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.177253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.177278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.177615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.272946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.273009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.273425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.273462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.274222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.274402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.278994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.279040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.279276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.280176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.280955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.281135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.281143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.283149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.284085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.284465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.284723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.285304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.285604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.286449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.287393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.287594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.287603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.289741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.290717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.290975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.291231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.291771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.292456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.293234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.294168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.294349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.294357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.296394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.296942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.297212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.297471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.298078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.299057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.300094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.301093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.301274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.301282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.303453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.303724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.365 [2024-07-24 23:49:26.303982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.365 [2024-07-24 23:49:26.304239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [... identical "Failed to get src_mbufs!" entries repeated through 2024-07-24 23:49:26.429239 omitted ...]
00:27:41.633 [2024-07-24 23:49:26.429268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.429294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.429628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.429637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.431309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.431337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.431372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.431397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.432538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.432569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.432594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.432626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.432806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.633 [2024-07-24 23:49:26.432814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.434619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.633 [2024-07-24 23:49:26.436211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.633 [2024-07-24 23:49:26.436868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.437998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.438350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.438593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.440939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.440947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.441992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.442673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.443757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.443784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.443809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.444152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.444181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.444206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.444527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.487220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.487273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.487606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.488903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.488931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.488966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.489713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.489932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.489959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.489995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.490905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.490934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.491136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.492532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.493013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.493040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.493066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.493420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.494283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.494311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.494343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.495333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.495518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.495527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.497591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.497630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.498581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.498614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.498908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.499165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.499192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.499447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.499776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.499785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.503414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.503452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.504372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.504401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.504614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.504986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.505014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.505754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.506088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.506097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.508519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.508555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.634 [2024-07-24 23:49:26.509480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.509510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.509721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.510619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.510648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.511502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.511720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.511728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.514649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.634 [2024-07-24 23:49:26.514690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.515638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.515674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.515876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.635 [2024-07-24 23:49:26.516879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.516929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.517870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.518129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.518137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.519657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.519698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.520235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.520262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.520611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.521334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.521362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.521705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.635 [2024-07-24 23:49:26.522035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.522043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.525802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.525839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.526792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.526828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.527881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.530118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.635 [2024-07-24 23:49:26.530151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.530963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.530991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.531204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.531984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.532014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.532940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.533121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.533129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.535488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.535526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.536370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.635 [2024-07-24 23:49:26.536397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.635 [2024-07-24 23:49:26.536606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:41.901 [... same error repeated for subsequent allocation attempts through 2024-07-24 23:49:26.675350 ...]
00:27:41.901 [2024-07-24 23:49:26.675382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.675670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.676493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.676523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.676788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.676815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.677038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.677046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.679876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.679915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.680176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.680203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.680500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.901 [2024-07-24 23:49:26.680777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.680808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.681833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.681869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.682260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.682269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.684253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.684306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.684577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.684610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.684971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.685241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.685269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.901 [2024-07-24 23:49:26.685535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.685563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.685884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.685892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.687944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.687984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.688408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.688439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.688628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.689045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.689075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.689758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.689792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.901 [2024-07-24 23:49:26.690086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.690094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.692106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.692142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.693106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.901 [2024-07-24 23:49:26.693136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.693495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.693770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.693810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.694892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.694922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.695260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.695270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.699203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.699245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.699521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.699550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.699838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.700121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.700154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.700854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.700885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.701152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.701161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.703325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.703358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.704159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.704190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.704375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.705350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.705384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.705883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.705914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.706093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.706101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.708961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.708999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.709766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.709795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.710115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.710672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.710703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.711484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.711513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.711690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.711704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.713736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.713770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.714733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.715163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.715345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.715672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.715702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.715958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.715986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.716164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.716173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.718610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.718650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.719317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.719346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.719564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.720534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.721462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.721495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.721520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.721746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.721755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.723994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.724018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.724200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.724209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.726748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.726780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.726822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.726847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.727414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.728840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.902 [2024-07-24 23:49:26.728869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.728901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.728926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.729264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.729295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.729321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.902 [2024-07-24 23:49:26.729348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.729373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.729645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.729654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.903 [2024-07-24 23:49:26.732579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.732875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.733052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.733061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.734705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.734739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.734764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.734788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.734967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.903 [2024-07-24 23:49:26.735005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.735033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.735057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.735082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.735399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.735408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.903 [2024-07-24 23:49:26.738488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.903 [2024-07-24 23:49:26.738513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:41.906 [last message repeated through 2024-07-24 23:49:26.865216]
00:27:41.906 [2024-07-24 23:49:26.865250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.866285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.866327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.867240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.867490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.867498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.870486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.871963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.906 [2024-07-24 23:49:26.873005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.874023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.874207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.874216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.877699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.878453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.878725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.879416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.879665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.879717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.879977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.880769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.881542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.906 [2024-07-24 23:49:26.881723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.881731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.885120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.886176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.886438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.886791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.886972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.887241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.887522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.888441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.889501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.889688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.889696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.906 [2024-07-24 23:49:26.893435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.894055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.894321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.895116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.895391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.906 [2024-07-24 23:49:26.895674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.896491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.897289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.898249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.898436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.898445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.902187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.902464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.169 [2024-07-24 23:49:26.902823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.903650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.903985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.904369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.905146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.906098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.907045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.907232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.907241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.910651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.910942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.911845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.912134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.169 [2024-07-24 23:49:26.912467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.913161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.914123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.915088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.915568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.915769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.915777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.920195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.920462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.920964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.921736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.921937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.922963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.169 [2024-07-24 23:49:26.923930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.924682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.925483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.925672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.925680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.928941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.929218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.930238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.930276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.930463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.931437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.932398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.932846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.169 [2024-07-24 23:49:26.933658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.933847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.933855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.169 [2024-07-24 23:49:26.937948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.937984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.938248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.939016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.939239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.940239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.941203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.941235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.941794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.942016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.942024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.944686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.944985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.945775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.945806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.946099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.946378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.946406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.946678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.946996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.947183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.947192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.949430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.950449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.950485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.950756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.951047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.951083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.951956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.952220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.952248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.952567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.952576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.954656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.954693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.954955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.955218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.955492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.956384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.956655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.956685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.957304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.957522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.957531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.960510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.960550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.961033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.961298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.961555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.961840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.961871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.962714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.963070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.963405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.963426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.966555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.966595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.966851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.967913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.968292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.968580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.968619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.968884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.969156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.969342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.969350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.971624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.971669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.972563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.972821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.973119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.974143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.974174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.974436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170 [2024-07-24 23:49:26.974707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.170 [2024-07-24 23:49:26.975061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.170-00:27:42.174 [... same error repeated ~270 times, timestamps 2024-07-24 23:49:26.975 through 23:49:27.095 ...]
00:27:42.174 [2024-07-24 23:49:27.095427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.095451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.095771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.095781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.098613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.099703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.100699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.100735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.174 [2024-07-24 23:49:27.100995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.101003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.104813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.104851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.104876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.105802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.105985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.106978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.174 [2024-07-24 23:49:27.109550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.109584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.110648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.111447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.111676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.111684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.114759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.115702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.174 [2024-07-24 23:49:27.115733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.115765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.116009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.116044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.116070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.116737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.116764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.117080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.117089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.119876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.120821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.120853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.120877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.174 [2024-07-24 23:49:27.121150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.121188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.122142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.122181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.122212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.122394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.122402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.125332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.125870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.125899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.125925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.126254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.126287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.174 [2024-07-24 23:49:27.127184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.127213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.127244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.127425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.174 [2024-07-24 23:49:27.127433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.130261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.131990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.132018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.132044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.132367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.132376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.135741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.136678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.136708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.136740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.137028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.137066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.137843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.137873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.137897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.138078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.138086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.140686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.140967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.140995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.141022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.141363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.141398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.142315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.142348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.142373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.142557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.142566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.145418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.146028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.146945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.149953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.150828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.150860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.150891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.151070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.151104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.151873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.151902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.151927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.152107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.152115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.155054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.155819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.155850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.155874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.156072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.156112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.157067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.157096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.157121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.157351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.157360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.160753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.161025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.161054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.161079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.161407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.161440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.162087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.175 [2024-07-24 23:49:27.162117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.162143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.162383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.175 [2024-07-24 23:49:27.162391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.165569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.165603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.166604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.166640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.166903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.166936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.167202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.167229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.167255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.438 [2024-07-24 23:49:27.167560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.167569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.170785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.170822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.170853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.171866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.172067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.438 [2024-07-24 23:49:27.173734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.438 [2024-07-24 23:49:27.176184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:42.442 [... identical "Failed to get src_mbufs!" error repeated for each subsequent allocation attempt, from 2024-07-24 23:49:27.176995 through 2024-07-24 23:49:27.314167 ...]
00:27:42.442 [2024-07-24 23:49:27.314408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.314417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.316388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.316420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.316683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.316951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.317291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.317561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.317591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.317876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.318143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.318409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.318418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.442 [2024-07-24 23:49:27.320444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.320481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.320745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.321009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.321357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.321625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.321655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.321906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.322246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.322427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.322435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.324256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.324288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.442 [2024-07-24 23:49:27.324548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.324807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.325097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.325364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.325393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.325651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.325907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.326234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.326242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.328143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.328176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.328431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.328702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.442 [2024-07-24 23:49:27.328974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.329242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.329271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.329533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.329788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.330118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.330130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.332103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.332135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.332671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.333440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.333626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.334585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.442 [2024-07-24 23:49:27.334616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.335403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.336271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.336494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.336502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.337969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.338230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.338260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.338550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.338730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.339579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.339609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.340603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.442 [2024-07-24 23:49:27.341660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.341912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.341920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.342990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.442 [2024-07-24 23:49:27.343252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.343517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.343545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.343901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.343934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.344187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.345198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.345233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.345412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.443 [2024-07-24 23:49:27.345420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.347438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.347480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.348451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.348482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.348663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.348932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.348959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.349213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.349239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.349546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.349555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.351806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.443 [2024-07-24 23:49:27.351845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.352346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.352374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.352560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.353411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.353439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.354427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.354464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.354648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.354656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.356900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.356932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.357756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.443 [2024-07-24 23:49:27.357785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.357965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.358913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.358943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.359544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.359581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.359759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.359767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.361147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.361178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.361441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.361471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.361827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.443 [2024-07-24 23:49:27.362196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.362224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.363018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.363048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.363231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.363239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.365312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.365345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.366271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.366300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.366513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.366785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.366812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.443 [2024-07-24 23:49:27.367066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.443 [2024-07-24 23:49:27.367097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.367461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.367479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.369579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.369611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.370049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.370077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.370309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.371312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.371342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.372274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.372302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.444 [2024-07-24 23:49:27.372485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.372494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.375056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.375088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.375865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.375894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.376074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.377704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.444 [2024-07-24 23:49:27.379064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.379094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.379348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.379374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.379698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.380285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.380313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.381096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.381130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.381312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.381320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.383333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.444 [2024-07-24 23:49:27.383365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.444 [2024-07-24 23:49:27.384296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:42.711 [2024-07-24 23:49:27.466212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.466482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.466510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.466765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.467062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.467094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.467351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.467380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.468230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.468452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.468460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.469550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.470318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.470347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.471277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.471460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.471503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.471873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.471900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.472155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.472447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.472455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.473875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.474833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.474863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.475662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.475906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.475942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.476721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.476751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.477683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.477861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.477869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.479478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.479737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.479766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.480630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.480811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.480845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.481861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.481896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.482800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.483046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.483054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.484978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.485235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.485261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.486113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.486292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.486300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.487398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.488329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.488359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.488628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.488958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.488992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.489249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.489274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.489532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.489774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.489782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.490799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.491529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.491557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.492325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.492507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.492547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.493483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.493512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.493934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.494300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.494309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.495873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.496846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.496882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.496910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.497090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.497123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.498137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.498165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.498917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.499129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.499137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.500561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.500826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.500854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.501109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.501288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.501322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.501347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.502263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.503254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.503434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.503442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.505482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.505749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.506007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.506263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.506594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.506630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.507256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.508035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.508983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.509159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.509167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.511271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.512047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.512307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.512566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.512843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.513106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.513365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.711 [2024-07-24 23:49:27.513632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.513924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.514253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.514264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.516322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.516583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.516840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.711 [2024-07-24 23:49:27.517099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.517391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.517666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.517945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.518208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.518476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.712 [2024-07-24 23:49:27.518784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.518793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.520839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.521102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.521363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.521626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.521945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.522208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.522463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.522726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.522989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.523304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.523312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.712 [2024-07-24 23:49:27.525298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.525562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.525821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.526078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.526420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.526686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.526948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.527215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.527475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.527824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.527833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.529711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.529981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.712 [2024-07-24 23:49:27.530241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.530511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.530836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.531115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.531368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.531628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.531888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.532267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.532276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.534299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.534575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.534839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.712 [2024-07-24 23:49:27.534867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.712 [2024-07-24 23:49:27.535187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:42.714 [... identical *ERROR* line repeated between 23:49:27.535187 and 23:49:27.650993; duplicate log lines omitted ...]
00:27:42.714 [2024-07-24 23:49:27.651930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.651960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.652234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.652511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.652771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.652801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.652827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.653159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.653168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.654337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.654373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.654399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.654423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.654602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.655517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.656749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.656778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.656819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.656846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.657247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.657651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.658838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.658873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.658900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.658924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.659219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.659405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.660996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.661021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.661356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.661364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.662932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.662960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.662991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.663519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.664631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.664670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.664695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.664720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.664943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.664982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.665008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.665043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.665069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.665415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.665424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.667014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.667042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.667075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.667991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.668478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.669555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.670308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.670337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.670362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.670706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.670751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.670778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.671034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.671073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.671424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.671432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.673584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.673617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.673642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.674094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.674284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.674326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.675336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.675367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.675395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.675586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.675595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.677195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.677223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.677491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.677521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.677706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.678510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.678540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.678565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.679523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.679709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.679717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.680850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.681813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.681844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.681877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.682775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.684167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.685130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.685161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.685186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.685377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.685416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.686276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.686306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.686332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.686539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.686548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.687843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.688111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.688139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.688165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.688518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.688551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.689017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.689046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.689071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.689295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.689303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.690419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.691216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.691246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.691271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.714 [2024-07-24 23:49:27.691457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.691499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.692454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.692486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.692511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.692776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.714 [2024-07-24 23:49:27.692785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.694647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.695708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.695743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.695780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.695964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.715 [2024-07-24 23:49:27.695997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.715 [2024-07-24 23:49:27.697009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:42.980 [... previous message repeated for every subsequent allocation attempt through 2024-07-24 23:49:27.799612; duplicate log lines omitted ...]
00:27:42.980 [2024-07-24 23:49:27.799858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.800137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.800403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.800433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.800702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.801049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.801058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.802745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.803014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.803282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.803328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.803636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.803905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.980 [2024-07-24 23:49:27.803935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.804202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.804458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.804842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.804852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.806798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.807061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.807092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.807351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.807716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.807758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.808036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.808294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.980 [2024-07-24 23:49:27.808322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.808641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.808650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.810593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.810630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.810889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.811147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.811475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.811755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.812019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.812050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.812304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.812619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.980 [2024-07-24 23:49:27.812628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.814549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.814601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.814867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.815131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.815394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.816213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.980 [2024-07-24 23:49:27.816248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.816896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.817443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.817771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.817779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.819732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.819767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.820030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.820287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.820621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.820896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.820926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.821190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.821466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.821845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.821858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.823856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.823902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.824159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.824416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.824792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.825997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.827921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.827960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.828217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.828480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.828820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.829104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.829135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.829907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.830837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.831020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.831028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.833140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.833174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.833799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.834065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.834404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.834677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.834713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.834971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.835991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.836171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.836179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.838084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.838118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.839136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.840154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.840407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.840711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.840743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.841001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.841259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.841501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.841510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.843349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.843381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.844163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.845095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.845278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.845835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.845873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.846132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.846388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.846747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.846757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.848837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.848869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.849304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.850086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.850268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.851308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.851346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.852230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.852493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.852809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.852818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.855193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.855227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.981 [2024-07-24 23:49:27.856186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.856829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.857013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.857834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.857867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.858820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.859755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.860105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.981 [2024-07-24 23:49:27.860113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.863028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.863964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.863996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.865083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.982 [2024-07-24 23:49:27.865269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.865709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.865741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.866516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.867444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.867646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.867656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.869284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.869550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.870427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.870456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.870695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.870734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.982 [2024-07-24 23:49:27.871681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.872643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.872674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.872966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.872974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.874325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.874358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.874617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.874645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.874913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.875182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.875214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.982 [2024-07-24 23:49:27.875776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.982 [2024-07-24 23:49:27.875806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:42.982 [... same "Failed to get src_mbufs!" error repeated with successive timestamps through 2024-07-24 23:49:27.965607; duplicate log lines trimmed ...]
00:27:42.985 [2024-07-24 23:49:27.965867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.965894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.965920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.966184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.966192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.967603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.968533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.968562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.968587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.968778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.968820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.969714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.969743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.985 [2024-07-24 23:49:27.969775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.969972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.969980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.971396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.971672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.971706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.971731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:42.985 [2024-07-24 23:49:27.972844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:42.985 [2024-07-24 23:49:27.972852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.973977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.974785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.974816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.974841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.975027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.975066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.976028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.976066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.976091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.976461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.976474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.978523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.249 [2024-07-24 23:49:27.979451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.979486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.979511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.979691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.979734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.980663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.980692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.980716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.980895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.980903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.981990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.982017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.982642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.249 [2024-07-24 23:49:27.982671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.983769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.985380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.985416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.985443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.986367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.986568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.249 [2024-07-24 23:49:27.987630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.987668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.987697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.987964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.988286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.988295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.989852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.990786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.990817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.991770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.992095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.992140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.993186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.249 [2024-07-24 23:49:27.993214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.994230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.994408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.994416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.995930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.996189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.249 [2024-07-24 23:49:27.996217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.996892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.997115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.997155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.998113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.998144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.999049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:27.999256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:27.999265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.000351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.000850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.000892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.001155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.001490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.001534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.001804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.001835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.002284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.002474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.002482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:28.003686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.004459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.004491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.005414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.005621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.005662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.006293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.006323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.006596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.006938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.006948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.008506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.009441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:28.009474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.010566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.010863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.010903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.011669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.011698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.012635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.012815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.012823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.014493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.014763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.014792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.015754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:28.015964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.016003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.016956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.016987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.017948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.018210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.018220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.019325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.019631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.019674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.019931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.020256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.020301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:28.020571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.020601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.020859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.021246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.021254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.023169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.023455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.023494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.023770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.024041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.024083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.024340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.024367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.250 [2024-07-24 23:49:28.024633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.250 [2024-07-24 23:49:28.024964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.024973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.026800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.027817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.028090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.251 [2024-07-24 23:49:28.028456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.251 [2024-07-24 23:49:28.028474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 (previous message repeated continuously, timestamps 2024-07-24 23:49:28.028474 through 23:49:28.159576)
00:27:43.254 [2024-07-24 23:49:28.159607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.159788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.159800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.161624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.161658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.162263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.162292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.162537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.163505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.163536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.164484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.164514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.164748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.254 [2024-07-24 23:49:28.164757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.166387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.166425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.166702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.166733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.167927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.169888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.254 [2024-07-24 23:49:28.169922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.170881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.170911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.171097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.171896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.171927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.172202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.172234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.172564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.172574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.175033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.175068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.176044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.254 [2024-07-24 23:49:28.176073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.176371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.177334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.177370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.178328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.178357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.178542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.178552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.180387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.180423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.181175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.181205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.181458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.254 [2024-07-24 23:49:28.182426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.182457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.183404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.183433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.183701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.183719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.185200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.185237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.185504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.185533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.254 [2024-07-24 23:49:28.185824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.186100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.186130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.186609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.186639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.186871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.186879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.188835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.188869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.189822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.190777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.191953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.191962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.193243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.193272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.194271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.194306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.194585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.195441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.196407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.196437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.196463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.196649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.196659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.198268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.198298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.198323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.198352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.198686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.199765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.200972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.201031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.201562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.203618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.203925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.205555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.205797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.207891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.255 [2024-07-24 23:49:28.207919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.208142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.208150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.255 [2024-07-24 23:49:28.209699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.209886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.256 [2024-07-24 23:49:28.209894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.211352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.211382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.211411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.211682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.212311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.256 [2024-07-24 23:49:28.214697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.256 [2024-07-24 23:49:28.215790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:43.521 [identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated many times, timestamps 2024-07-24 23:49:28.215829 through 2024-07-24 23:49:28.325599]
00:27:43.521 [2024-07-24 23:49:28.325609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.327718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.327987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.328249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.328525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.328813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.328848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.329110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.329375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.329649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.329963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.329972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.331952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.521 [2024-07-24 23:49:28.332216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.332480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.332758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.333095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.333367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.333644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.333914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.334177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.334535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.334544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.336467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.336733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.336992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.521 [2024-07-24 23:49:28.337255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.337534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.337815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.338080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.338342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.338617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.338940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.338949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.340944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.341214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.341489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.341771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.342070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.521 [2024-07-24 23:49:28.342341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.342615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.342893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.343156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.343501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.343513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.345473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.345760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.346025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.346287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.346586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.346867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.521 [2024-07-24 23:49:28.347126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.347382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.347662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.347992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.348002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.349905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.350166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.350425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.351331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.351614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.386302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.386590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.388542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.389211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.389251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.390152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.390940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.391883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.392125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.393139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.393322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.393331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.393337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.395366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.395724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.395988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.396251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.396598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.397005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.397810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.398761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.399712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.400012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.400021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.400028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.401506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.401790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.402056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.402903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.403156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.404122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.404830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.405758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.406510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.406696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.406705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.406712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.408656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.409643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.410581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.411547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.411843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.412781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.413761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.414802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.415077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.415420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.415430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.415438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.417945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.418875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.419577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.420298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.420487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.421397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.421661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.421920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.422177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.422520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.422531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.422539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.424515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.425254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.426214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.426662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.427056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.427326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.427603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.427928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.428798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.428983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.428992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.428998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.431099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.431521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.432288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.432557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.522 [2024-07-24 23:49:28.432824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.522 [2024-07-24 23:49:28.433710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.433975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.434289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.435237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.435422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.435431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.435438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.437033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.437301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.437570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.438405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.438651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.523 [2024-07-24 23:49:28.439473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.440284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.441018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.441591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.441945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.441954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.441961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.444327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.444865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.445919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.446831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.447117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.523 [2024-07-24 23:49:28.447398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.523 [2024-07-24 23:49:28.447667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [last message repeated for subsequent allocation attempts through 2024-07-24 23:49:28.567781]
00:27:43.791 [2024-07-24 23:49:28.567816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.568405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.568434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.568622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.568634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.569332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.569362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.569751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.569783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.570129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.570138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.570145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.570153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.791 [2024-07-24 23:49:28.572629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.572669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.573198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.573228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.573442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.573450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.574411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.574442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.574705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.574732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.575070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.575081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.575089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.791 [2024-07-24 23:49:28.575096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.577416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.791 [2024-07-24 23:49:28.578673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.578679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.580948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.581163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.791 [2024-07-24 23:49:28.581172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.581179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.581186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.582502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.791 [2024-07-24 23:49:28.582529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.582984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.583008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.583344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.583357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.583365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.583372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.584975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.584999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.585250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.585259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.585266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.585272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.587749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.587977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.589689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.589738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.590015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.590023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.590029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.590037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.591649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.591723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.592024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.592033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.592040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.592046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.593436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.593889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.792 [2024-07-24 23:49:28.595631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.792 [2024-07-24 23:49:28.595739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.595764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.595943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.595952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.595959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.595965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.793 [2024-07-24 23:49:28.597233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.597940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.599584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.793 [2024-07-24 23:49:28.599643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.793 [2024-07-24 23:49:28.599690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:43.796 [2024-07-24 23:49:28.657795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical "Failed to get src_mbufs!" error lines between these two timestamps omitted)
00:27:43.796 [2024-07-24 23:49:28.658683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.658868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.658877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.658883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.658890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.660491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.660749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.660797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.661363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.661581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.661589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.661649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.662535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.796 [2024-07-24 23:49:28.662586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.663296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.663528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.663536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.663543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.663550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.666050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.666323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.666379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.667298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.667479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.667487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.667523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.796 [2024-07-24 23:49:28.668419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.668447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.668872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.669051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.669060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.669066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.669073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.670508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.670768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.671025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.671707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.671938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.671946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.796 [2024-07-24 23:49:28.671983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.672786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.673611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.674334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.674559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.674567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.674574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.674580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.678080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.678894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.679837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.680355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.680549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.796 [2024-07-24 23:49:28.680557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.681332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.682274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.682780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.683060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.683388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.796 [2024-07-24 23:49:28.683397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.683405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.683413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.685838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.686781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.687227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.688162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.688348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.688356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.689304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.689637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.689904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.690168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.690518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.690527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.690535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.690542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.693407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.693683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.693950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.694214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.694558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.694568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.695253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.695989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.696664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.697578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.697794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.697802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.697808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.697815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.699722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.700684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.701456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.701878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.702062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.702070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.702843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.703248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.703522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.703790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.704135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.704144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.704152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.704159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.706529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.706803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.707068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.707333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.707550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.707559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.708276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.709016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.709901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.710635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.710857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.710866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.710873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.710880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.713589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.714336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.714819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.715828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.716014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.716023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.716549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.716834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.717090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.717347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.717667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.717676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.717682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.717688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.720873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.721139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.721398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.722106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.722333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.722341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.723228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.723916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.724619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.725283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.725582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.725591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.797 [2024-07-24 23:49:28.725601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.725608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.727948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.728578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.729494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.730218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.730441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.730449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.797 [2024-07-24 23:49:28.730720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.730980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.731252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.731557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.731735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.798 [2024-07-24 23:49:28.731743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.731749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.731756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.735368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.735636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.736218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.736979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.737159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.737167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.737720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.738421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.739332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:43.798 [2024-07-24 23:49:28.740099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:43.798 [2024-07-24 23:49:28.740409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:44.068 [... same error repeated verbatim through 23:49:28.857025; duplicate entries omitted ...]
00:27:44.068 [2024-07-24 23:49:28.857055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.857400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.857410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.857677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.857707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.858771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.860868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.860904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.861709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.861738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.862992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.866892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.866947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.867995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.868040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.868221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.868230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.868658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.868689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.869710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.871938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.871975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.872000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.872023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.872203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.872212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.873529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.873536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.876758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.876793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.876837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.876863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.877625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.877633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.878694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.878725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.878750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.878774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.068 [2024-07-24 23:49:28.879352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.879366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.068 [2024-07-24 23:49:28.882009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.882722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.882746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.884448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.884845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.888525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.888910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.889962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.889990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.890408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.890717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.893968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.893993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.894434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.895879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.895912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.895940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.895965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.896306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.896773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.899353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.899386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.899428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.069 [2024-07-24 23:49:28.899453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.069 [2024-07-24 23:49:28.899646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:44.073 [2024-07-24 23:49:28.963213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.963220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.963226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.966182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.966477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.966507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.966775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.967136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.967145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.967175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.967505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.967536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.968437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:28.968623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.968632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.968638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.968646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.971254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.972743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.973688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.973724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:28.974565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.974817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.974825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.974832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.974839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.977554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.978063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.978092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.978783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.978967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.978975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.979016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.979731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:28.979760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.980458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.980670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.980679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.980685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.980694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.984176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.985218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.986050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.986411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.986598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.986606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.986646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:28.987570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.987834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.988095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.988402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.988411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.988418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.988424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.991734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.992317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.992908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.993167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.993393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.993402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:28.994049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.994308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.995963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.998825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:28.999837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:29.000714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:29.001073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:29.001255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.073 [2024-07-24 23:49:29.001263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.073 [2024-07-24 23:49:29.002128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.002414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.003946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.007780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.008717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.008977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.009235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.009538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.009547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.009807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.010473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.011175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.011859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.012065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.012073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.012079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.012086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.015026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.015297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.016250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.017044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.017229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.017238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.017807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.018820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.019693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.019975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.020341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.020350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.020360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.020367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.024205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.024679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.025380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.025641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.025891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.025899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.026676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.026933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.027193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.027865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.028096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.028104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.028110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.028117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.031063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.031445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.031705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.032664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.032999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.033007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.033274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.033538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.033809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.034077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.034407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.034415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.034423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.034430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.036744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.037676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.037935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.038199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.038571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.038580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.038850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.039930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.039936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.043425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.044043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.044310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.045190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.045421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.045429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.046363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.047006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.047953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.048768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.048950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.048959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.074 [2024-07-24 23:49:29.048965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.074 [2024-07-24 23:49:29.048971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.052113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.052408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.053397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.053856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.054044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.054057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.054332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.054772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.055540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.055806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.075 [2024-07-24 23:49:29.056102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.371 [2024-07-24 23:49:29.192384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.193093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.193124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.194553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.197336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.197381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.198399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.198428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.371 [2024-07-24 23:49:29.198669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.198678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.199474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.199504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.200678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.203203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.203242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.204292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.371 [2024-07-24 23:49:29.204319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.204512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.204521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.205464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.205498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.206808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.209300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.209337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.371 [2024-07-24 23:49:29.210120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.210152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.210435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.210443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.210713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.210741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.371 [2024-07-24 23:49:29.211680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.211711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.211906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.211914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.211920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.211927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.214460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.214505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.214530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.214555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.214734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.214742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.215424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.218031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.218800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.218807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.221694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.221700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.224829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.224844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.226706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.226740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.226768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.226793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.227549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.227571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.372 [2024-07-24 23:49:29.229496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.229763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.231272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.231303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.372 [2024-07-24 23:49:29.231336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.373 [2024-07-24 23:49:29.231823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.231848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.232110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.232118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.232124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.232131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.373 [2024-07-24 23:49:29.234595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.234656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.235006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.235015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.235022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.235034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.373 [2024-07-24 23:49:29.236900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.373 [2024-07-24 23:49:29.236935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 last message repeated (timestamps 2024-07-24 23:49:29.236962 through 23:49:29.293191) 
00:27:44.376 [2024-07-24 23:49:29.293198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.295447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.296487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.296519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.297998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.298630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.298849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.298858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.376 [2024-07-24 23:49:29.298865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.298872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.301148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.302014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.302058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.303644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.304581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.304893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.376 [2024-07-24 23:49:29.304908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.304915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.304923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.306843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.307572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.307606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.308844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.309700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.376 [2024-07-24 23:49:29.310063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.310073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.310081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.310088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.311890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.312640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.312673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.313035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.313381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.313389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.376 [2024-07-24 23:49:29.313426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.313932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.313961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.314531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.314860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.314871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.314879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.314890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.316739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.317732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.317997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.318257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.318459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.318471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.318507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.318982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.319242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.320133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.320341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.320349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.320356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.320362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.322447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.323261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.323626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.323883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.324063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.324071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.324831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.325299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.326293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.327105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.327286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.327295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.327301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.327308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.328992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.329987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.330256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.331225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.331535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.331544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.331813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.332454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.332980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.333240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.333492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.333501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.333509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.333515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.335171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.335450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.335717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.335975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.336290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.336298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.336579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.336844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.337765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.339355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.339638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.339900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.340160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.377 [2024-07-24 23:49:29.340411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.340420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.340694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.340961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.341798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.343238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.344152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.377 [2024-07-24 23:49:29.345065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.378 [2024-07-24 23:49:29.345581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.345900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.345909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.346177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.346437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.347117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.347824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.348004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.348013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.348020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.348026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.349256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.349529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.378 [2024-07-24 23:49:29.349788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.350050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.350249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.350257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.350759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.351497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.351914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.352176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.352418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.352426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.352433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.352439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.354182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.378 [2024-07-24 23:49:29.354943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.355338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.355605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.355837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.355846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.356111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.356518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.357258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.358042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.358254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.358262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.358268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:44.378 [2024-07-24 23:49:29.358275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:44.378 [2024-07-24 23:49:29.359822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:44.641 [previous message repeated through 23:49:29.457789]
00:27:44.641 [2024-07-24 23:49:29.457789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:44.900 
00:27:44.900 Latency(us)
00:27:44.900 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:27:44.900 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x0 length 0x100
00:27:44.900 crypto_ram : 5.58 60.77 3.80 0.00 0.00 2006707.72 72401.68 1645765.00
00:27:44.900 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x100 length 0x100
00:27:44.900 crypto_ram : 5.49 57.90 3.62 0.00 0.00 2102917.36 48683.89 1725656.50
00:27:44.900 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x0 length 0x100
00:27:44.900 crypto_ram1 : 5.62 66.91 4.18 0.00 0.00 1834804.15 61915.92 1533916.89
00:27:44.900 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x100 length 0x100
00:27:44.900 crypto_ram1 : 5.53 62.77 3.92 0.00 0.00 1929070.46 46187.28 1613808.40
00:27:44.900 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x0 length 0x100
00:27:44.900 crypto_ram2 : 5.38 424.88 26.56 0.00 0.00 281401.40 45937.62 425422.26
00:27:44.900 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x100 length 0x100
00:27:44.900 crypto_ram2 : 5.35 416.80 26.05 0.00 0.00 286581.50 24591.60 435408.70
00:27:44.900 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x0 length 0x100
00:27:44.900 crypto_ram3 : 5.43 434.64 27.16 0.00 0.00 269505.40 18974.23 327555.17
00:27:44.900 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:27:44.900 Verification LBA range: start 0x100 length 0x100
00:27:44.900 crypto_ram3 : 5.41 431.21 26.95 0.00 0.00 271910.90 10423.34 321563.31
00:27:44.900 ===================================================================================================================
00:27:44.900 Total : 1955.86 122.24 0.00 0.00 496837.78 10423.34 1725656.50
00:27:45.158 
00:27:45.158 real 0m8.494s
00:27:45.158 user 0m16.349s
00:27:45.158 sys 0m0.305s
00:27:45.158 23:49:30 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:27:45.158 23:49:30 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:27:45.158 ************************************
00:27:45.158 END TEST bdev_verify_big_io
00:27:45.158 ************************************
00:27:45.417 23:49:30 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:45.417 23:49:30 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:27:45.417 23:49:30 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:27:45.417 23:49:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:27:45.417 ************************************
00:27:45.417 START TEST bdev_write_zeroes
00:27:45.417 ************************************
00:27:45.417 23:49:30 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:27:45.417 [2024-07-24 23:49:30.237125] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization...
00:27:45.417 [2024-07-24 23:49:30.237159] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid452392 ] 00:27:45.417 [2024-07-24 23:49:30.298245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:45.417 [2024-07-24 23:49:30.369339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.417 [2024-07-24 23:49:30.390243] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:27:45.417 [2024-07-24 23:49:30.398269] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:45.417 [2024-07-24 23:49:30.406287] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:45.675 [2024-07-24 23:49:30.507568] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:27:48.206 [2024-07-24 23:49:32.643820] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:27:48.206 [2024-07-24 23:49:32.643872] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:48.206 [2024-07-24 23:49:32.643881] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.206 [2024-07-24 23:49:32.651839] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:27:48.206 [2024-07-24 23:49:32.651851] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:48.206 [2024-07-24 23:49:32.651857] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.206 [2024-07-24 23:49:32.659858] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:27:48.206 [2024-07-24 23:49:32.659868] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:48.206 [2024-07-24 23:49:32.659873] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.206 [2024-07-24 23:49:32.667877] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:27:48.206 [2024-07-24 23:49:32.667888] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:48.206 [2024-07-24 23:49:32.667893] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:48.206 Running I/O for 1 seconds... 00:27:48.772 00:27:48.772 Latency(us) 00:27:48.772 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:48.772 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:48.772 crypto_ram : 1.02 3102.02 12.12 0.00 0.00 41058.38 3526.46 48683.89 00:27:48.772 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:48.772 crypto_ram1 : 1.02 3107.60 12.14 0.00 0.00 40834.44 3510.86 45188.63 00:27:48.772 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:48.772 crypto_ram2 : 1.01 24205.82 94.55 0.00 0.00 5233.49 1560.38 6803.26 00:27:48.772 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:48.772 crypto_ram3 : 1.01 24238.98 94.68 0.00 0.00 5215.81 1560.38 5867.03 00:27:48.772 =================================================================================================================== 00:27:48.772 Total : 54654.42 213.49 0.00 0.00 9294.58 1560.38 48683.89 00:27:49.339 00:27:49.339 real 0m3.866s 00:27:49.339 user 0m3.575s 00:27:49.339 sys 0m0.249s 00:27:49.339 23:49:34 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:49.339 23:49:34 
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:27:49.340 ************************************ 00:27:49.340 END TEST bdev_write_zeroes 00:27:49.340 ************************************ 00:27:49.340 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:49.340 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:27:49.340 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:49.340 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:49.340 ************************************ 00:27:49.340 START TEST bdev_json_nonenclosed 00:27:49.340 ************************************ 00:27:49.340 23:49:34 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:49.340 [2024-07-24 23:49:34.169260] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:27:49.340 [2024-07-24 23:49:34.169292] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid453081 ] 00:27:49.340 [2024-07-24 23:49:34.230065] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.340 [2024-07-24 23:49:34.301815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.340 [2024-07-24 23:49:34.301870] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:27:49.340 [2024-07-24 23:49:34.301897] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:49.340 [2024-07-24 23:49:34.301903] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:49.598 00:27:49.598 real 0m0.253s 00:27:49.598 user 0m0.163s 00:27:49.598 sys 0m0.089s 00:27:49.598 23:49:34 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:49.598 23:49:34 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:27:49.598 ************************************ 00:27:49.598 END TEST bdev_json_nonenclosed 00:27:49.598 ************************************ 00:27:49.598 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:49.598 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:27:49.598 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:49.598 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:49.598 ************************************ 00:27:49.598 START TEST 
bdev_json_nonarray 00:27:49.598 ************************************ 00:27:49.598 23:49:34 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:49.598 [2024-07-24 23:49:34.483547] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:27:49.598 [2024-07-24 23:49:34.483581] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid453106 ] 00:27:49.598 [2024-07-24 23:49:34.545259] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.857 [2024-07-24 23:49:34.618118] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.857 [2024-07-24 23:49:34.618176] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:27:49.857 [2024-07-24 23:49:34.618185] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:49.857 [2024-07-24 23:49:34.618190] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:49.857 00:27:49.857 real 0m0.254s 00:27:49.857 user 0m0.163s 00:27:49.857 sys 0m0.089s 00:27:49.857 23:49:34 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:49.857 23:49:34 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:27:49.857 ************************************ 00:27:49.857 END TEST bdev_json_nonarray 00:27:49.857 ************************************ 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:27:49.857 23:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:27:49.857 00:27:49.857 real 1m5.994s 00:27:49.857 user 2m38.762s 00:27:49.857 sys 0m5.796s 00:27:49.857 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:27:49.857 23:49:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:27:49.857 ************************************ 00:27:49.857 END TEST blockdev_crypto_qat 00:27:49.857 ************************************ 00:27:49.857 23:49:34 -- spdk/autotest.sh@364 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:27:49.857 23:49:34 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:27:49.857 23:49:34 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:49.857 23:49:34 -- common/autotest_common.sh@10 -- # set +x 00:27:49.857 ************************************ 00:27:49.857 START TEST chaining 00:27:49.857 ************************************ 00:27:49.857 23:49:34 chaining -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:27:50.116 * Looking for test storage... 00:27:50.116 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@7 -- # uname -s 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@17 -- # nvme 
gen-hostnqn 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:27:50.116 23:49:34 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:27:50.116 23:49:34 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:27:50.116 23:49:34 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:27:50.116 23:49:34 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:50.116 23:49:34 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:50.116 23:49:34 chaining -- paths/export.sh@4 
-- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:50.116 23:49:34 chaining -- paths/export.sh@5 -- # export PATH 00:27:50.116 23:49:34 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@47 -- # : 0 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 
33445566778899001122334455001122) 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:27:50.116 23:49:34 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:27:50.116 23:49:34 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:27:50.116 23:49:34 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:27:50.116 23:49:34 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:27:50.116 23:49:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:27:56.683 23:49:41 
chaining -- nvmf/common.sh@296 -- # e810=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@297 -- # x722=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@298 -- # mlx=() 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:27:56.683 23:49:41 chaining -- 
nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:27:56.683 Found 0000:af:00.0 (0x8086 - 0x159b) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:27:56.683 Found 0000:af:00.1 (0x8086 - 0x159b) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@399 -- # 
pci_net_devs=("${pci_net_devs[@]##*/}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:27:56.683 Found net devices under 0000:af:00.0: cvl_0_0 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:27:56.683 Found net devices under 0000:af:00.1: cvl_0_1 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:27:56.683 23:49:41 chaining -- 
nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:27:56.683 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:27:56.683 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.221 ms 00:27:56.683 00:27:56.683 --- 10.0.0.2 ping statistics --- 00:27:56.683 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:27:56.683 rtt min/avg/max/mdev = 0.221/0.221/0.221/0.000 ms 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:27:56.683 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:27:56.683 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.199 ms 00:27:56.683 00:27:56.683 --- 10.0.0.1 ping statistics --- 00:27:56.683 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:27:56.683 rtt min/avg/max/mdev = 0.199/0.199/0.199/0.000 ms 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@422 -- # return 0 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:27:56.683 23:49:41 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@481 -- # nvmfpid=456726 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@482 -- # waitforlisten 456726 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@831 -- # '[' -z 456726 ']' 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:27:56.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:56.683 23:49:41 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:56.683 23:49:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:56.683 [2024-07-24 23:49:41.434258] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:27:56.683 [2024-07-24 23:49:41.434300] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:56.684 [2024-07-24 23:49:41.502329] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:56.684 [2024-07-24 23:49:41.573758] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:27:56.684 [2024-07-24 23:49:41.573794] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:27:56.684 [2024-07-24 23:49:41.573800] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:27:56.684 [2024-07-24 23:49:41.573806] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:27:56.684 [2024-07-24 23:49:41.573810] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:27:56.684 [2024-07-24 23:49:41.573843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:27:57.249 23:49:42 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:57.249 23:49:42 chaining -- common/autotest_common.sh@864 -- # return 0 00:27:57.249 23:49:42 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:27:57.249 23:49:42 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:27:57.249 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.249 23:49:42 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:27:57.507 23:49:42 chaining -- bdev/chaining.sh@69 -- # mktemp 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.A9tOLerdM9 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@69 -- # mktemp 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.q6jwwLEU8R 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.508 malloc0 00:27:57.508 true 00:27:57.508 true 00:27:57.508 [2024-07-24 23:49:42.295458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:27:57.508 crypto0 00:27:57.508 [2024-07-24 23:49:42.303486] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:27:57.508 crypto1 00:27:57.508 [2024-07-24 23:49:42.311592] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:27:57.508 [2024-07-24 23:49:42.327746] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:27:57.508 23:49:42 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:57.508 23:49:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:27:57.508 23:49:42 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.A9tOLerdM9 bs=1K count=64 00:27:57.767 64+0 records in 00:27:57.767 64+0 records out 00:27:57.767 65536 bytes (66 kB, 64 KiB) copied, 0.000241809 s, 271 MB/s 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.A9tOLerdM9 --ob Nvme0n1 --bs 65536 --count 1 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@25 -- # local config 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:27:57.767 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@31 -- # config='{ 00:27:57.767 "subsystems": [ 00:27:57.767 { 00:27:57.767 "subsystem": "bdev", 00:27:57.767 "config": [ 00:27:57.767 { 00:27:57.767 "method": "bdev_nvme_attach_controller", 00:27:57.767 "params": { 00:27:57.767 "trtype": "tcp", 00:27:57.767 "adrfam": "IPv4", 00:27:57.767 "name": "Nvme0", 00:27:57.767 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:57.767 "traddr": "10.0.0.2", 00:27:57.767 "trsvcid": "4420" 00:27:57.767 } 00:27:57.767 }, 00:27:57.767 { 00:27:57.767 "method": "bdev_set_options", 00:27:57.767 "params": { 00:27:57.767 "bdev_auto_examine": false 00:27:57.767 } 00:27:57.767 } 00:27:57.767 ] 00:27:57.767 } 00:27:57.767 ] 00:27:57.767 }' 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.A9tOLerdM9 --ob Nvme0n1 --bs 65536 --count 1 00:27:57.767 23:49:42 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:27:57.767 "subsystems": [ 00:27:57.767 { 00:27:57.767 "subsystem": "bdev", 00:27:57.767 "config": [ 00:27:57.767 { 00:27:57.767 "method": "bdev_nvme_attach_controller", 00:27:57.767 "params": { 
00:27:57.767 "trtype": "tcp", 00:27:57.767 "adrfam": "IPv4", 00:27:57.767 "name": "Nvme0", 00:27:57.767 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:57.767 "traddr": "10.0.0.2", 00:27:57.767 "trsvcid": "4420" 00:27:57.767 } 00:27:57.767 }, 00:27:57.767 { 00:27:57.767 "method": "bdev_set_options", 00:27:57.767 "params": { 00:27:57.767 "bdev_auto_examine": false 00:27:57.767 } 00:27:57.767 } 00:27:57.767 ] 00:27:57.767 } 00:27:57.767 ] 00:27:57.767 }' 00:27:57.767 [2024-07-24 23:49:42.609517] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:27:57.767 [2024-07-24 23:49:42.609557] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid456998 ] 00:27:57.767 [2024-07-24 23:49:42.673251] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:57.767 [2024-07-24 23:49:42.751238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:58.284  Copying: 64/64 [kB] (average 10 MBps) 00:27:58.284 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@589 -- 
# [[ 0 == 0 ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.284 23:49:43 chaining -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:27:58.284 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.284 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@96 -- # update_stats 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.542 
23:49:43 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.542 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:27:58.542 23:49:43 
chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:27:58.542 23:49:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:58.543 23:49:43 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:58.543 23:49:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:58.543 23:49:43 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.q6jwwLEU8R --ib Nvme0n1 --bs 65536 --count 1 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@25 -- # local config 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:27:58.543 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@31 -- # config='{ 00:27:58.543 "subsystems": [ 00:27:58.543 { 00:27:58.543 "subsystem": "bdev", 00:27:58.543 "config": [ 00:27:58.543 { 00:27:58.543 "method": "bdev_nvme_attach_controller", 00:27:58.543 
"params": { 00:27:58.543 "trtype": "tcp", 00:27:58.543 "adrfam": "IPv4", 00:27:58.543 "name": "Nvme0", 00:27:58.543 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:58.543 "traddr": "10.0.0.2", 00:27:58.543 "trsvcid": "4420" 00:27:58.543 } 00:27:58.543 }, 00:27:58.543 { 00:27:58.543 "method": "bdev_set_options", 00:27:58.543 "params": { 00:27:58.543 "bdev_auto_examine": false 00:27:58.543 } 00:27:58.543 } 00:27:58.543 ] 00:27:58.543 } 00:27:58.543 ] 00:27:58.543 }' 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.q6jwwLEU8R --ib Nvme0n1 --bs 65536 --count 1 00:27:58.543 23:49:43 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:27:58.543 "subsystems": [ 00:27:58.543 { 00:27:58.543 "subsystem": "bdev", 00:27:58.543 "config": [ 00:27:58.543 { 00:27:58.543 "method": "bdev_nvme_attach_controller", 00:27:58.543 "params": { 00:27:58.543 "trtype": "tcp", 00:27:58.543 "adrfam": "IPv4", 00:27:58.543 "name": "Nvme0", 00:27:58.543 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:58.543 "traddr": "10.0.0.2", 00:27:58.543 "trsvcid": "4420" 00:27:58.543 } 00:27:58.543 }, 00:27:58.543 { 00:27:58.543 "method": "bdev_set_options", 00:27:58.543 "params": { 00:27:58.543 "bdev_auto_examine": false 00:27:58.543 } 00:27:58.543 } 00:27:58.543 ] 00:27:58.543 } 00:27:58.543 ] 00:27:58.543 }' 00:27:58.801 [2024-07-24 23:49:43.564988] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:27:58.801 [2024-07-24 23:49:43.565028] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457049 ] 00:27:58.801 [2024-07-24 23:49:43.629937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:58.801 [2024-07-24 23:49:43.702450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:59.317  Copying: 64/64 [kB] (average 31 MBps) 00:27:59.317 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:59.317 23:49:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:27:59.317 23:49:44 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:27:59.576 23:49:44 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.A9tOLerdM9 /tmp/tmp.q6jwwLEU8R 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@25 -- # local config 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:27:59.576 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@31 -- # config='{ 00:27:59.576 "subsystems": [ 00:27:59.576 { 00:27:59.576 "subsystem": "bdev", 00:27:59.576 "config": [ 00:27:59.576 { 00:27:59.576 "method": "bdev_nvme_attach_controller", 00:27:59.576 "params": { 00:27:59.576 "trtype": "tcp", 00:27:59.576 "adrfam": "IPv4", 00:27:59.576 "name": "Nvme0", 00:27:59.576 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:59.576 "traddr": "10.0.0.2", 00:27:59.576 "trsvcid": "4420" 00:27:59.576 } 00:27:59.576 }, 00:27:59.576 { 00:27:59.576 "method": "bdev_set_options", 00:27:59.576 "params": { 00:27:59.576 "bdev_auto_examine": false 00:27:59.576 } 00:27:59.576 } 00:27:59.576 ] 00:27:59.576 } 00:27:59.576 ] 00:27:59.576 }' 00:27:59.576 
23:49:44 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:27:59.576 23:49:44 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:27:59.576 "subsystems": [ 00:27:59.576 { 00:27:59.576 "subsystem": "bdev", 00:27:59.576 "config": [ 00:27:59.576 { 00:27:59.576 "method": "bdev_nvme_attach_controller", 00:27:59.576 "params": { 00:27:59.576 "trtype": "tcp", 00:27:59.576 "adrfam": "IPv4", 00:27:59.576 "name": "Nvme0", 00:27:59.576 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:27:59.576 "traddr": "10.0.0.2", 00:27:59.576 "trsvcid": "4420" 00:27:59.576 } 00:27:59.576 }, 00:27:59.576 { 00:27:59.576 "method": "bdev_set_options", 00:27:59.576 "params": { 00:27:59.576 "bdev_auto_examine": false 00:27:59.576 } 00:27:59.576 } 00:27:59.576 ] 00:27:59.576 } 00:27:59.576 ] 00:27:59.576 }' 00:27:59.576 [2024-07-24 23:49:44.487889] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:27:59.576 [2024-07-24 23:49:44.487929] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457296 ] 00:27:59.576 [2024-07-24 23:49:44.550666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:59.835 [2024-07-24 23:49:44.623296] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:00.093  Copying: 64/64 [kB] (average 12 MBps) 00:28:00.093 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@106 -- # update_stats 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:00.093 23:49:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:00.093 23:49:44 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.093 23:49:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.093 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.093 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.352 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.352 
23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:00.352 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.352 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.352 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.A9tOLerdM9 --ob Nvme0n1 --bs 4096 --count 16 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@25 -- # local config 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:00.352 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:00.352 "subsystems": [ 00:28:00.352 { 00:28:00.352 "subsystem": "bdev", 00:28:00.352 "config": [ 00:28:00.352 { 00:28:00.352 "method": "bdev_nvme_attach_controller", 00:28:00.352 "params": { 00:28:00.352 "trtype": "tcp", 00:28:00.352 "adrfam": "IPv4", 00:28:00.352 "name": "Nvme0", 00:28:00.352 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:00.352 "traddr": "10.0.0.2", 00:28:00.352 "trsvcid": "4420" 00:28:00.352 } 00:28:00.352 }, 00:28:00.352 { 00:28:00.352 "method": "bdev_set_options", 00:28:00.352 "params": { 00:28:00.352 "bdev_auto_examine": false 00:28:00.352 } 00:28:00.352 } 00:28:00.352 ] 00:28:00.352 } 00:28:00.352 ] 00:28:00.352 }' 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.A9tOLerdM9 --ob Nvme0n1 --bs 4096 --count 16 00:28:00.352 23:49:45 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:00.352 "subsystems": [ 00:28:00.352 { 00:28:00.352 "subsystem": "bdev", 00:28:00.352 "config": [ 00:28:00.352 { 00:28:00.352 "method": "bdev_nvme_attach_controller", 00:28:00.352 "params": { 00:28:00.352 "trtype": "tcp", 00:28:00.352 "adrfam": "IPv4", 00:28:00.352 "name": "Nvme0", 00:28:00.352 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:00.352 "traddr": "10.0.0.2", 00:28:00.352 "trsvcid": "4420" 00:28:00.352 } 00:28:00.352 }, 00:28:00.352 { 00:28:00.352 "method": "bdev_set_options", 00:28:00.352 "params": { 00:28:00.352 "bdev_auto_examine": false 00:28:00.352 } 00:28:00.352 } 00:28:00.352 ] 00:28:00.352 } 00:28:00.352 ] 00:28:00.352 }' 00:28:00.352 [2024-07-24 23:49:45.256767] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:28:00.352 [2024-07-24 23:49:45.256809] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457463 ] 00:28:00.352 [2024-07-24 23:49:45.319636] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:00.610 [2024-07-24 23:49:45.392049] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:00.869  Copying: 64/64 [kB] (average 9142 kBps) 00:28:00.869 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:00.869 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:00.869 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@114 -- # update_stats 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:01.128 23:49:45 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:01.128 23:49:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.128 23:49:45 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:01.128 23:49:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.128 23:49:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.128 23:49:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@117 -- # : 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.q6jwwLEU8R --ib Nvme0n1 --bs 4096 --count 16 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@25 -- # local config 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:28:01.128 {"method": "bdev_set_options", "params": 
{"bdev_auto_examine": false}}' 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:28:01.128 "subsystems": [ 00:28:01.128 { 00:28:01.128 "subsystem": "bdev", 00:28:01.128 "config": [ 00:28:01.128 { 00:28:01.128 "method": "bdev_nvme_attach_controller", 00:28:01.128 "params": { 00:28:01.128 "trtype": "tcp", 00:28:01.128 "adrfam": "IPv4", 00:28:01.128 "name": "Nvme0", 00:28:01.128 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:01.128 "traddr": "10.0.0.2", 00:28:01.128 "trsvcid": "4420" 00:28:01.128 } 00:28:01.128 }, 00:28:01.128 { 00:28:01.128 "method": "bdev_set_options", 00:28:01.128 "params": { 00:28:01.128 "bdev_auto_examine": false 00:28:01.128 } 00:28:01.128 } 00:28:01.128 ] 00:28:01.128 } 00:28:01.128 ] 00:28:01.128 }' 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.q6jwwLEU8R --ib Nvme0n1 --bs 4096 --count 16 00:28:01.128 23:49:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:28:01.128 "subsystems": [ 00:28:01.128 { 00:28:01.128 "subsystem": "bdev", 00:28:01.128 "config": [ 00:28:01.128 { 00:28:01.128 "method": "bdev_nvme_attach_controller", 00:28:01.128 "params": { 00:28:01.128 "trtype": "tcp", 00:28:01.128 "adrfam": "IPv4", 00:28:01.128 "name": "Nvme0", 00:28:01.128 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:28:01.128 "traddr": "10.0.0.2", 00:28:01.128 "trsvcid": "4420" 00:28:01.128 } 00:28:01.128 }, 00:28:01.128 { 00:28:01.128 "method": "bdev_set_options", 00:28:01.128 "params": { 00:28:01.128 "bdev_auto_examine": false 00:28:01.128 } 00:28:01.128 } 00:28:01.128 ] 00:28:01.128 } 00:28:01.128 ] 00:28:01.128 }' 00:28:01.128 [2024-07-24 23:49:46.120363] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 
initialization... 00:28:01.128 [2024-07-24 23:49:46.120405] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457586 ] 00:28:01.387 [2024-07-24 23:49:46.184021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:01.387 [2024-07-24 23:49:46.256813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:01.904  Copying: 64/64 [kB] (average 761 kBps) 00:28:01.904 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:28:01.904 23:49:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.904 23:49:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.904 23:49:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:01.904 23:49:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:01.904 23:49:46 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:01.905 23:49:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:01.905 23:49:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:01.905 23:49:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:01.905 23:49:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:01.905 23:49:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:28:02.163 23:49:46 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:02.163 23:49:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.A9tOLerdM9 /tmp/tmp.q6jwwLEU8R 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.A9tOLerdM9 /tmp/tmp.q6jwwLEU8R 00:28:02.163 23:49:46 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:28:02.163 23:49:46 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:02.163 23:49:46 chaining -- nvmf/common.sh@117 -- # sync 00:28:02.163 23:49:46 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:02.163 23:49:46 chaining -- nvmf/common.sh@120 -- # set +e 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:02.163 rmmod nvme_tcp 00:28:02.163 rmmod nvme_fabrics 00:28:02.163 rmmod nvme_keyring 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@124 -- # set -e 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@125 -- # return 0 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@489 -- # '[' -n 456726 ']' 00:28:02.163 23:49:47 chaining -- nvmf/common.sh@490 -- # killprocess 456726 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@950 -- # '[' -z 
456726 ']' 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@954 -- # kill -0 456726 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@955 -- # uname 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 456726 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 456726' 00:28:02.163 killing process with pid 456726 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@969 -- # kill 456726 00:28:02.163 23:49:47 chaining -- common/autotest_common.sh@974 -- # wait 456726 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:02.422 23:49:47 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:02.422 23:49:47 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:02.422 23:49:47 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:04.952 23:49:49 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:28:04.952 23:49:49 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:28:04.952 23:49:49 chaining -- bdev/chaining.sh@132 -- # bperfpid=458155 00:28:04.952 23:49:49 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc 
-z 00:28:04.953 23:49:49 chaining -- bdev/chaining.sh@134 -- # waitforlisten 458155 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@831 -- # '[' -z 458155 ']' 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:04.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:04.953 23:49:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:04.953 [2024-07-24 23:49:49.392166] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:28:04.953 [2024-07-24 23:49:49.392211] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid458155 ] 00:28:04.953 [2024-07-24 23:49:49.456297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:04.953 [2024-07-24 23:49:49.534456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:05.211 23:49:50 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:05.211 23:49:50 chaining -- common/autotest_common.sh@864 -- # return 0 00:28:05.211 23:49:50 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:28:05.211 23:49:50 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:05.211 23:49:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:05.469 malloc0 00:28:05.469 true 00:28:05.469 true 00:28:05.469 [2024-07-24 23:49:50.310119] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:05.469 crypto0 
00:28:05.469 [2024-07-24 23:49:50.318141] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:05.469 crypto1 00:28:05.469 23:49:50 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:05.469 23:49:50 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:05.469 Running I/O for 5 seconds... 00:28:10.763 00:28:10.763 Latency(us) 00:28:10.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.763 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:10.763 Verification LBA range: start 0x0 length 0x2000 00:28:10.763 crypto1 : 5.01 17700.22 69.14 0.00 0.00 14429.79 364.74 11546.82 00:28:10.763 =================================================================================================================== 00:28:10.763 Total : 17700.22 69.14 0.00 0.00 14429.79 364.74 11546.82 00:28:10.763 0 00:28:10.763 23:49:55 chaining -- bdev/chaining.sh@146 -- # killprocess 458155 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@950 -- # '[' -z 458155 ']' 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@954 -- # kill -0 458155 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@955 -- # uname 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 458155 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 458155' 00:28:10.763 killing process with pid 458155 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@969 -- # kill 458155 00:28:10.763 Received shutdown signal, test time was about 5.000000 seconds 
00:28:10.763 00:28:10.763 Latency(us) 00:28:10.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.763 =================================================================================================================== 00:28:10.763 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@974 -- # wait 458155 00:28:10.763 23:49:55 chaining -- bdev/chaining.sh@152 -- # bperfpid=459231 00:28:10.763 23:49:55 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:10.763 23:49:55 chaining -- bdev/chaining.sh@154 -- # waitforlisten 459231 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@831 -- # '[' -z 459231 ']' 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:10.763 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:10.763 23:49:55 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:10.763 [2024-07-24 23:49:55.709623] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:28:10.763 [2024-07-24 23:49:55.709668] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459231 ] 00:28:11.022 [2024-07-24 23:49:55.775094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:11.022 [2024-07-24 23:49:55.855189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:11.588 23:49:56 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:11.588 23:49:56 chaining -- common/autotest_common.sh@864 -- # return 0 00:28:11.588 23:49:56 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:28:11.588 23:49:56 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:11.588 23:49:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:11.846 malloc0 00:28:11.846 true 00:28:11.846 true 00:28:11.846 [2024-07-24 23:49:56.633483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:28:11.846 [2024-07-24 23:49:56.633521] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.846 [2024-07-24 23:49:56.633535] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x205eee0 00:28:11.846 [2024-07-24 23:49:56.633541] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.846 [2024-07-24 23:49:56.634265] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.846 [2024-07-24 23:49:56.634281] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:28:11.846 pt0 00:28:11.846 [2024-07-24 23:49:56.641508] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:11.846 crypto0 00:28:11.846 [2024-07-24 23:49:56.649522] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:28:11.846 crypto1 00:28:11.846 23:49:56 chaining -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:11.846 23:49:56 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:11.846 Running I/O for 5 seconds... 00:28:17.111 00:28:17.111 Latency(us) 00:28:17.111 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:17.111 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:17.111 Verification LBA range: start 0x0 length 0x2000 00:28:17.111 crypto1 : 5.01 13864.43 54.16 0.00 0.00 18419.99 1763.23 13044.78 00:28:17.111 =================================================================================================================== 00:28:17.112 Total : 13864.43 54.16 0.00 0.00 18419.99 1763.23 13044.78 00:28:17.112 0 00:28:17.112 23:50:01 chaining -- bdev/chaining.sh@167 -- # killprocess 459231 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@950 -- # '[' -z 459231 ']' 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@954 -- # kill -0 459231 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@955 -- # uname 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 459231 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 459231' 00:28:17.112 killing process with pid 459231 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@969 -- # kill 459231 00:28:17.112 Received shutdown signal, test time was about 5.000000 seconds 00:28:17.112 00:28:17.112 Latency(us) 00:28:17.112 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:17.112 
=================================================================================================================== 00:28:17.112 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@974 -- # wait 459231 00:28:17.112 23:50:01 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:28:17.112 23:50:01 chaining -- bdev/chaining.sh@170 -- # killprocess 459231 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@950 -- # '[' -z 459231 ']' 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@954 -- # kill -0 459231 00:28:17.112 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (459231) - No such process 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@977 -- # echo 'Process with pid 459231 is not found' 00:28:17.112 Process with pid 459231 is not found 00:28:17.112 23:50:01 chaining -- bdev/chaining.sh@171 -- # wait 459231 00:28:17.112 23:50:01 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:28:17.112 23:50:01 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:17.112 23:50:01 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:28:17.112 23:50:02 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@296 -- # e810=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@297 -- # x722=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@298 -- # mlx=() 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:28:17.112 23:50:02 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:28:17.112 Found 0000:af:00.0 (0x8086 - 0x159b) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:28:17.112 Found 0000:af:00.1 (0x8086 - 0x159b) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:28:17.112 23:50:02 chaining -- 
nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:28:17.112 Found net devices under 0000:af:00.0: cvl_0_0 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:28:17.112 Found net devices under 0000:af:00.1: cvl_0_1 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 
00:28:17.112 23:50:02 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:28:17.112 23:50:02 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:28:17.371 23:50:02 
chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:28:17.371 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:28:17.371 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.164 ms 00:28:17.371 00:28:17.371 --- 10.0.0.2 ping statistics --- 00:28:17.371 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:17.371 rtt min/avg/max/mdev = 0.164/0.164/0.164/0.000 ms 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:28:17.371 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:28:17.371 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.219 ms 00:28:17.371 00:28:17.371 --- 10.0.0.1 ping statistics --- 00:28:17.371 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:28:17.371 rtt min/avg/max/mdev = 0.219/0.219/0.219/0.000 ms 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@422 -- # return 0 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:28:17.371 23:50:02 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 
00:28:17.371 23:50:02 chaining -- nvmf/common.sh@481 -- # nvmfpid=460368 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@482 -- # waitforlisten 460368 00:28:17.371 23:50:02 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@831 -- # '[' -z 460368 ']' 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:17.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:17.371 23:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:17.371 [2024-07-24 23:50:02.337402] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:28:17.371 [2024-07-24 23:50:02.337446] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:17.629 [2024-07-24 23:50:02.399850] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:17.629 [2024-07-24 23:50:02.477218] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:28:17.629 [2024-07-24 23:50:02.477254] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:28:17.629 [2024-07-24 23:50:02.477260] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:28:17.629 [2024-07-24 23:50:02.477266] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:28:17.629 [2024-07-24 23:50:02.477271] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:28:17.629 [2024-07-24 23:50:02.477303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@864 -- # return 0 00:28:18.195 23:50:03 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:18.195 23:50:03 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:28:18.195 23:50:03 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:18.195 23:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:18.195 malloc0 00:28:18.195 [2024-07-24 23:50:03.191610] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:18.453 [2024-07-24 23:50:03.207746] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:18.453 23:50:03 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:28:18.453 23:50:03 chaining -- bdev/chaining.sh@189 -- # bperfpid=460426 00:28:18.453 23:50:03 chaining -- bdev/chaining.sh@191 -- # waitforlisten 460426 /var/tmp/bperf.sock 00:28:18.453 23:50:03 chaining -- bdev/chaining.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@831 -- # '[' -z 460426 ']' 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:18.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:18.453 23:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:18.453 [2024-07-24 23:50:03.267096] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 00:28:18.453 [2024-07-24 23:50:03.267132] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460426 ] 00:28:18.453 [2024-07-24 23:50:03.329844] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:18.453 [2024-07-24 23:50:03.402345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:19.387 23:50:04 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:19.387 23:50:04 chaining -- common/autotest_common.sh@864 -- # return 0 00:28:19.387 23:50:04 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:28:19.387 23:50:04 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:19.387 [2024-07-24 23:50:04.381823] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:19.387 nvme0n1 
00:28:19.387 true 00:28:19.387 crypto0 00:28:19.646 23:50:04 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:19.646 Running I/O for 5 seconds... 00:28:24.912 00:28:24.912 Latency(us) 00:28:24.912 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:24.912 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:28:24.912 Verification LBA range: start 0x0 length 0x2000 00:28:24.912 crypto0 : 5.01 12977.28 50.69 0.00 0.00 19679.70 1583.79 17725.93 00:28:24.912 =================================================================================================================== 00:28:24.912 Total : 12977.28 50.69 0.00 0.00 19679.70 1583.79 17725.93 00:28:24.912 0 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@205 -- # sequence=130126 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:24.912 
23:50:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@206 -- # encrypt=65063 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:24.912 23:50:09 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@207 -- # decrypt=65063 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 
00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:25.171 23:50:10 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:25.429 23:50:10 chaining -- bdev/chaining.sh@208 -- # crc32c=130126 00:28:25.429 23:50:10 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:28:25.429 23:50:10 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:28:25.429 23:50:10 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:28:25.429 23:50:10 chaining -- bdev/chaining.sh@214 -- # killprocess 460426 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@950 -- # '[' -z 460426 ']' 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@954 -- # kill -0 460426 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@955 -- # uname 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 460426 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 460426' 00:28:25.429 killing process with pid 460426 00:28:25.429 23:50:10 chaining -- 
common/autotest_common.sh@969 -- # kill 460426 00:28:25.429 Received shutdown signal, test time was about 5.000000 seconds 00:28:25.429 00:28:25.429 Latency(us) 00:28:25.429 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:25.429 =================================================================================================================== 00:28:25.429 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:25.429 23:50:10 chaining -- common/autotest_common.sh@974 -- # wait 460426 00:28:25.687 23:50:10 chaining -- bdev/chaining.sh@219 -- # bperfpid=461579 00:28:25.687 23:50:10 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:28:25.687 23:50:10 chaining -- bdev/chaining.sh@221 -- # waitforlisten 461579 /var/tmp/bperf.sock 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@831 -- # '[' -z 461579 ']' 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:28:25.687 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:25.687 23:50:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:25.687 [2024-07-24 23:50:10.500257] Starting SPDK v24.09-pre git sha1 68f798423 / DPDK 24.03.0 initialization... 
00:28:25.687 [2024-07-24 23:50:10.500302] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461579 ] 00:28:25.687 [2024-07-24 23:50:10.567030] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:25.687 [2024-07-24 23:50:10.640553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:26.620 23:50:11 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:26.620 23:50:11 chaining -- common/autotest_common.sh@864 -- # return 0 00:28:26.620 23:50:11 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:28:26.620 23:50:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:28:26.620 [2024-07-24 23:50:11.615458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:28:26.620 nvme0n1 00:28:26.620 true 00:28:26.620 crypto0 00:28:26.878 23:50:11 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:28:26.878 Running I/O for 5 seconds... 
00:28:32.146 00:28:32.146 Latency(us) 00:28:32.146 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:32.146 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:28:32.146 Verification LBA range: start 0x0 length 0x200 00:28:32.146 crypto0 : 5.00 2526.34 157.90 0.00 0.00 12425.07 889.42 13419.28 00:28:32.146 =================================================================================================================== 00:28:32.146 Total : 2526.34 157.90 0.00 0.00 12425.07 889.42 13419.28 00:28:32.146 0 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@233 -- # sequence=25284 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:28:32.146 23:50:16 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:28:32.146 23:50:16 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@234 -- # encrypt=12642 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:28:32.146 23:50:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@235 -- # decrypt=12642 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:28:32.405 23:50:17 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:28:32.405 23:50:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:28:32.663 23:50:17 chaining -- bdev/chaining.sh@236 -- # crc32c=25284 00:28:32.663 23:50:17 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:28:32.663 23:50:17 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:28:32.663 23:50:17 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:28:32.663 23:50:17 chaining -- bdev/chaining.sh@242 -- # killprocess 461579 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@950 -- # '[' -z 461579 ']' 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@954 -- # kill -0 461579 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@955 -- # uname 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 461579 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 461579' 00:28:32.663 killing process with pid 461579 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@969 -- # kill 461579 00:28:32.663 Received shutdown signal, test time was about 5.000000 seconds 00:28:32.663 00:28:32.663 Latency(us) 00:28:32.663 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:32.663 
=================================================================================================================== 00:28:32.663 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:32.663 23:50:17 chaining -- common/autotest_common.sh@974 -- # wait 461579 00:28:32.922 23:50:17 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@117 -- # sync 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@120 -- # set +e 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:28:32.922 rmmod nvme_tcp 00:28:32.922 rmmod nvme_fabrics 00:28:32.922 rmmod nvme_keyring 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@124 -- # set -e 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@125 -- # return 0 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@489 -- # '[' -n 460368 ']' 00:28:32.922 23:50:17 chaining -- nvmf/common.sh@490 -- # killprocess 460368 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@950 -- # '[' -z 460368 ']' 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@954 -- # kill -0 460368 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@955 -- # uname 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 460368 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 460368' 00:28:32.922 killing process with pid 460368 
00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@969 -- # kill 460368 00:28:32.922 23:50:17 chaining -- common/autotest_common.sh@974 -- # wait 460368 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:28:33.181 23:50:17 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:28:33.181 23:50:17 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:28:33.181 23:50:17 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:28:35.084 23:50:20 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:28:35.084 23:50:20 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:28:35.084 00:28:35.084 real 0m45.245s 00:28:35.084 user 0m54.841s 00:28:35.084 sys 0m10.149s 00:28:35.084 23:50:20 chaining -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:35.084 23:50:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:28:35.084 ************************************ 00:28:35.084 END TEST chaining 00:28:35.084 ************************************ 00:28:35.084 23:50:20 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:28:35.084 23:50:20 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:28:35.084 23:50:20 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:28:35.084 23:50:20 -- spdk/autotest.sh@379 -- # [[ 0 -eq 1 ]] 00:28:35.084 23:50:20 -- spdk/autotest.sh@384 -- # trap - SIGINT SIGTERM EXIT 00:28:35.084 23:50:20 -- spdk/autotest.sh@386 -- # timing_enter post_cleanup 00:28:35.084 23:50:20 -- common/autotest_common.sh@724 -- # xtrace_disable 00:28:35.084 23:50:20 -- common/autotest_common.sh@10 -- # set +x 00:28:35.084 23:50:20 -- 
spdk/autotest.sh@387 -- # autotest_cleanup 00:28:35.084 23:50:20 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:28:35.084 23:50:20 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:28:35.084 23:50:20 -- common/autotest_common.sh@10 -- # set +x 00:28:39.316 INFO: APP EXITING 00:28:39.316 INFO: killing all VMs 00:28:39.316 INFO: killing vhost app 00:28:39.316 INFO: EXIT DONE 00:28:42.600 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:28:42.600 Waiting for block devices as requested 00:28:42.600 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:28:42.600 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:28:42.600 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:28:42.600 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:28:42.600 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:28:42.859 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:28:42.859 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:28:42.859 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:28:42.859 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:28:43.119 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:28:43.119 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:28:43.119 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:28:43.378 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:28:43.378 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:28:43.378 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:28:43.378 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:28:43.635 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:28:46.166 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:28:46.733 Cleaning 00:28:46.733 Removing: /var/run/dpdk/spdk0/config 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:28:46.733 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:28:46.733 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:28:46.733 Removing: /var/run/dpdk/spdk0/hugepage_info 00:28:46.733 Removing: /dev/shm/nvmf_trace.0 00:28:46.733 Removing: /dev/shm/spdk_tgt_trace.pid199068 00:28:46.733 Removing: /var/run/dpdk/spdk0 00:28:46.733 Removing: /var/run/dpdk/spdk_pid195454 00:28:46.733 Removing: /var/run/dpdk/spdk_pid197857 00:28:46.733 Removing: /var/run/dpdk/spdk_pid199068 00:28:46.733 Removing: /var/run/dpdk/spdk_pid199633 00:28:46.733 Removing: /var/run/dpdk/spdk_pid200566 00:28:46.733 Removing: /var/run/dpdk/spdk_pid200837 00:28:46.733 Removing: /var/run/dpdk/spdk_pid201955 00:28:46.733 Removing: /var/run/dpdk/spdk_pid201974 00:28:46.733 Removing: /var/run/dpdk/spdk_pid202307 00:28:46.733 Removing: /var/run/dpdk/spdk_pid205350 00:28:46.733 Removing: /var/run/dpdk/spdk_pid207058 00:28:46.733 Removing: /var/run/dpdk/spdk_pid207338 00:28:46.733 Removing: /var/run/dpdk/spdk_pid207623 00:28:46.733 Removing: /var/run/dpdk/spdk_pid207925 00:28:46.733 Removing: /var/run/dpdk/spdk_pid208209 00:28:46.733 Removing: /var/run/dpdk/spdk_pid208467 00:28:46.733 Removing: /var/run/dpdk/spdk_pid208711 00:28:46.733 Removing: /var/run/dpdk/spdk_pid208983 00:28:46.733 Removing: /var/run/dpdk/spdk_pid209814 00:28:46.733 Removing: /var/run/dpdk/spdk_pid212675 00:28:46.733 Removing: /var/run/dpdk/spdk_pid212921 00:28:46.733 Removing: /var/run/dpdk/spdk_pid213211 00:28:46.733 Removing: /var/run/dpdk/spdk_pid213474 00:28:46.733 Removing: /var/run/dpdk/spdk_pid213514 00:28:46.733 Removing: /var/run/dpdk/spdk_pid213776 00:28:46.733 Removing: /var/run/dpdk/spdk_pid214029 00:28:46.733 Removing: /var/run/dpdk/spdk_pid214271 00:28:46.733 Removing: /var/run/dpdk/spdk_pid214519 00:28:46.733 Removing: 
/var/run/dpdk/spdk_pid214771 00:28:46.733 Removing: /var/run/dpdk/spdk_pid215015 00:28:46.733 Removing: /var/run/dpdk/spdk_pid215263 00:28:46.733 Removing: /var/run/dpdk/spdk_pid215508 00:28:46.733 Removing: /var/run/dpdk/spdk_pid215752 00:28:46.733 Removing: /var/run/dpdk/spdk_pid216000 00:28:46.733 Removing: /var/run/dpdk/spdk_pid216250 00:28:46.733 Removing: /var/run/dpdk/spdk_pid216494 00:28:46.733 Removing: /var/run/dpdk/spdk_pid216744 00:28:46.733 Removing: /var/run/dpdk/spdk_pid216990 00:28:46.733 Removing: /var/run/dpdk/spdk_pid217243 00:28:46.733 Removing: /var/run/dpdk/spdk_pid217491 00:28:46.733 Removing: /var/run/dpdk/spdk_pid217766 00:28:46.733 Removing: /var/run/dpdk/spdk_pid218068 00:28:46.733 Removing: /var/run/dpdk/spdk_pid218348 00:28:46.733 Removing: /var/run/dpdk/spdk_pid218633 00:28:46.733 Removing: /var/run/dpdk/spdk_pid218896 00:28:46.733 Removing: /var/run/dpdk/spdk_pid219183 00:28:46.733 Removing: /var/run/dpdk/spdk_pid219431 00:28:46.733 Removing: /var/run/dpdk/spdk_pid219684 00:28:46.733 Removing: /var/run/dpdk/spdk_pid219954 00:28:46.733 Removing: /var/run/dpdk/spdk_pid220394 00:28:46.733 Removing: /var/run/dpdk/spdk_pid220652 00:28:46.733 Removing: /var/run/dpdk/spdk_pid220908 00:28:46.733 Removing: /var/run/dpdk/spdk_pid221339 00:28:46.733 Removing: /var/run/dpdk/spdk_pid221435 00:28:46.733 Removing: /var/run/dpdk/spdk_pid221744 00:28:46.733 Removing: /var/run/dpdk/spdk_pid222310 00:28:46.733 Removing: /var/run/dpdk/spdk_pid222570 00:28:46.733 Removing: /var/run/dpdk/spdk_pid222805 00:28:46.733 Removing: /var/run/dpdk/spdk_pid226477 00:28:46.733 Removing: /var/run/dpdk/spdk_pid228595 00:28:46.733 Removing: /var/run/dpdk/spdk_pid230552 00:28:46.733 Removing: /var/run/dpdk/spdk_pid231701 00:28:46.733 Removing: /var/run/dpdk/spdk_pid232861 00:28:46.733 Removing: /var/run/dpdk/spdk_pid233138 00:28:46.992 Removing: /var/run/dpdk/spdk_pid233353 00:28:46.992 Removing: /var/run/dpdk/spdk_pid233378 00:28:46.992 Removing: 
/var/run/dpdk/spdk_pid238532 00:28:46.992 Removing: /var/run/dpdk/spdk_pid239240 00:28:46.992 Removing: /var/run/dpdk/spdk_pid240384 00:28:46.992 Removing: /var/run/dpdk/spdk_pid240634 00:28:46.992 Removing: /var/run/dpdk/spdk_pid245947 00:28:46.992 Removing: /var/run/dpdk/spdk_pid247538 00:28:46.992 Removing: /var/run/dpdk/spdk_pid248383 00:28:46.992 Removing: /var/run/dpdk/spdk_pid252525 00:28:46.992 Removing: /var/run/dpdk/spdk_pid254113 00:28:46.992 Removing: /var/run/dpdk/spdk_pid255093 00:28:46.992 Removing: /var/run/dpdk/spdk_pid259110 00:28:46.992 Removing: /var/run/dpdk/spdk_pid261482 00:28:46.992 Removing: /var/run/dpdk/spdk_pid262312 00:28:46.992 Removing: /var/run/dpdk/spdk_pid272096 00:28:46.992 Removing: /var/run/dpdk/spdk_pid274170 00:28:46.992 Removing: /var/run/dpdk/spdk_pid275188 00:28:46.992 Removing: /var/run/dpdk/spdk_pid284464 00:28:46.992 Removing: /var/run/dpdk/spdk_pid286571 00:28:46.992 Removing: /var/run/dpdk/spdk_pid287582 00:28:46.992 Removing: /var/run/dpdk/spdk_pid296860 00:28:46.992 Removing: /var/run/dpdk/spdk_pid300053 00:28:46.992 Removing: /var/run/dpdk/spdk_pid301064 00:28:46.992 Removing: /var/run/dpdk/spdk_pid311904 00:28:46.992 Removing: /var/run/dpdk/spdk_pid314296 00:28:46.992 Removing: /var/run/dpdk/spdk_pid315503 00:28:46.992 Removing: /var/run/dpdk/spdk_pid325862 00:28:46.992 Removing: /var/run/dpdk/spdk_pid328241 00:28:46.992 Removing: /var/run/dpdk/spdk_pid329259 00:28:46.992 Removing: /var/run/dpdk/spdk_pid339716 00:28:46.992 Removing: /var/run/dpdk/spdk_pid343802 00:28:46.992 Removing: /var/run/dpdk/spdk_pid344818 00:28:46.992 Removing: /var/run/dpdk/spdk_pid345894 00:28:46.992 Removing: /var/run/dpdk/spdk_pid349020 00:28:46.992 Removing: /var/run/dpdk/spdk_pid354036 00:28:46.992 Removing: /var/run/dpdk/spdk_pid356748 00:28:46.992 Removing: /var/run/dpdk/spdk_pid361362 00:28:46.992 Removing: /var/run/dpdk/spdk_pid364776 00:28:46.992 Removing: /var/run/dpdk/spdk_pid370037 00:28:46.992 Removing: 
/var/run/dpdk/spdk_pid373154 00:28:46.992 Removing: /var/run/dpdk/spdk_pid379933 00:28:46.992 Removing: /var/run/dpdk/spdk_pid382284 00:28:46.992 Removing: /var/run/dpdk/spdk_pid388434 00:28:46.992 Removing: /var/run/dpdk/spdk_pid390578 00:28:46.992 Removing: /var/run/dpdk/spdk_pid396731 00:28:46.992 Removing: /var/run/dpdk/spdk_pid398881 00:28:46.992 Removing: /var/run/dpdk/spdk_pid403282 00:28:46.992 Removing: /var/run/dpdk/spdk_pid403664 00:28:46.992 Removing: /var/run/dpdk/spdk_pid404121 00:28:46.992 Removing: /var/run/dpdk/spdk_pid404575 00:28:46.992 Removing: /var/run/dpdk/spdk_pid405112 00:28:46.992 Removing: /var/run/dpdk/spdk_pid405976 00:28:46.992 Removing: /var/run/dpdk/spdk_pid406816 00:28:46.992 Removing: /var/run/dpdk/spdk_pid407148 00:28:46.992 Removing: /var/run/dpdk/spdk_pid409367 00:28:46.992 Removing: /var/run/dpdk/spdk_pid411209 00:28:46.992 Removing: /var/run/dpdk/spdk_pid413048 00:28:46.992 Removing: /var/run/dpdk/spdk_pid414488 00:28:46.992 Removing: /var/run/dpdk/spdk_pid416330 00:28:46.992 Removing: /var/run/dpdk/spdk_pid418170 00:28:46.992 Removing: /var/run/dpdk/spdk_pid420009 00:28:46.992 Removing: /var/run/dpdk/spdk_pid421443 00:28:46.992 Removing: /var/run/dpdk/spdk_pid422141 00:28:46.992 Removing: /var/run/dpdk/spdk_pid422613 00:28:46.992 Removing: /var/run/dpdk/spdk_pid424692 00:28:46.992 Removing: /var/run/dpdk/spdk_pid426884 00:28:46.992 Removing: /var/run/dpdk/spdk_pid429252 00:28:46.992 Removing: /var/run/dpdk/spdk_pid430510 00:28:46.992 Removing: /var/run/dpdk/spdk_pid431895 00:28:46.992 Removing: /var/run/dpdk/spdk_pid432474 00:28:46.992 Removing: /var/run/dpdk/spdk_pid432646 00:28:46.992 Removing: /var/run/dpdk/spdk_pid432774 00:28:46.992 Removing: /var/run/dpdk/spdk_pid433029 00:28:46.992 Removing: /var/run/dpdk/spdk_pid433142 00:28:46.992 Removing: /var/run/dpdk/spdk_pid434363 00:28:46.992 Removing: /var/run/dpdk/spdk_pid436195 00:28:46.992 Removing: /var/run/dpdk/spdk_pid438318 00:28:46.992 Removing: 
/var/run/dpdk/spdk_pid439461 00:28:46.992 Removing: /var/run/dpdk/spdk_pid440382 00:28:46.992 Removing: /var/run/dpdk/spdk_pid440708 00:28:46.992 Removing: /var/run/dpdk/spdk_pid440864 00:28:46.992 Removing: /var/run/dpdk/spdk_pid440889 00:28:46.992 Removing: /var/run/dpdk/spdk_pid441944 00:28:46.992 Removing: /var/run/dpdk/spdk_pid442571 00:28:46.992 Removing: /var/run/dpdk/spdk_pid443036 00:28:46.992 Removing: /var/run/dpdk/spdk_pid445162 00:28:46.992 Removing: /var/run/dpdk/spdk_pid447487 00:28:46.992 Removing: /var/run/dpdk/spdk_pid449656 00:28:46.992 Removing: /var/run/dpdk/spdk_pid451021 00:28:46.992 Removing: /var/run/dpdk/spdk_pid452392 00:28:46.992 Removing: /var/run/dpdk/spdk_pid453081 00:28:46.992 Removing: /var/run/dpdk/spdk_pid453106 00:28:46.992 Removing: /var/run/dpdk/spdk_pid456998 00:28:46.992 Removing: /var/run/dpdk/spdk_pid457049 00:28:47.251 Removing: /var/run/dpdk/spdk_pid457296 00:28:47.251 Removing: /var/run/dpdk/spdk_pid457463 00:28:47.251 Removing: /var/run/dpdk/spdk_pid457586 00:28:47.251 Removing: /var/run/dpdk/spdk_pid458155 00:28:47.251 Removing: /var/run/dpdk/spdk_pid459231 00:28:47.251 Removing: /var/run/dpdk/spdk_pid460426 00:28:47.251 Removing: /var/run/dpdk/spdk_pid461579 00:28:47.251 Clean 00:28:47.251 23:50:32 -- common/autotest_common.sh@1451 -- # return 0 00:28:47.251 23:50:32 -- spdk/autotest.sh@388 -- # timing_exit post_cleanup 00:28:47.251 23:50:32 -- common/autotest_common.sh@730 -- # xtrace_disable 00:28:47.251 23:50:32 -- common/autotest_common.sh@10 -- # set +x 00:28:47.251 23:50:32 -- spdk/autotest.sh@390 -- # timing_exit autotest 00:28:47.251 23:50:32 -- common/autotest_common.sh@730 -- # xtrace_disable 00:28:47.251 23:50:32 -- common/autotest_common.sh@10 -- # set +x 00:28:47.251 23:50:32 -- spdk/autotest.sh@391 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:28:47.251 23:50:32 -- spdk/autotest.sh@393 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 
]] 00:28:47.251 23:50:32 -- spdk/autotest.sh@393 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:28:47.251 23:50:32 -- spdk/autotest.sh@395 -- # hash lcov 00:28:47.251 23:50:32 -- spdk/autotest.sh@395 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:28:47.251 23:50:32 -- spdk/autotest.sh@397 -- # hostname 00:28:47.251 23:50:32 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-03 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:28:47.509 geninfo: WARNING: invalid characters removed from testname! 00:29:09.433 23:50:50 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:09.433 23:50:53 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:10.000 23:50:54 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:11.901 23:50:56 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:13.275 23:50:58 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:15.178 23:50:59 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:29:17.082 23:51:01 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:29:17.082 23:51:01 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:17.082 23:51:01 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:29:17.082 23:51:01 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:17.082 23:51:01 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:17.082 23:51:01 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:17.082 23:51:01 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:17.082 23:51:01 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:17.082 23:51:01 -- paths/export.sh@5 -- $ export PATH 00:29:17.082 23:51:01 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:17.082 23:51:01 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:17.082 23:51:01 -- common/autobuild_common.sh@447 -- $ date +%s 00:29:17.082 23:51:01 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721857861.XXXXXX 00:29:17.082 
23:51:01 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721857861.LAZuQw 00:29:17.082 23:51:01 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]] 00:29:17.082 23:51:01 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']' 00:29:17.082 23:51:01 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:29:17.082 23:51:01 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:29:17.082 23:51:01 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:29:17.082 23:51:01 -- common/autobuild_common.sh@463 -- $ get_config_params 00:29:17.082 23:51:01 -- common/autotest_common.sh@398 -- $ xtrace_disable 00:29:17.082 23:51:01 -- common/autotest_common.sh@10 -- $ set +x 00:29:17.082 23:51:01 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:29:17.082 23:51:01 -- common/autobuild_common.sh@465 -- $ start_monitor_resources 00:29:17.082 23:51:01 -- pm/common@17 -- $ local monitor 00:29:17.082 23:51:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.082 23:51:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.082 23:51:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.082 23:51:01 -- pm/common@21 -- $ date +%s 00:29:17.082 23:51:01 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.082 23:51:01 -- pm/common@21 -- $ date +%s 00:29:17.082 23:51:01 -- 
pm/common@21 -- $ date +%s 00:29:17.082 23:51:01 -- pm/common@25 -- $ sleep 1 00:29:17.082 23:51:01 -- pm/common@21 -- $ date +%s 00:29:17.082 23:51:01 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721857861 00:29:17.082 23:51:01 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721857861 00:29:17.082 23:51:01 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721857861 00:29:17.082 23:51:01 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721857861 00:29:17.082 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721857861_collect-vmstat.pm.log 00:29:17.082 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721857861_collect-cpu-load.pm.log 00:29:17.082 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721857861_collect-cpu-temp.pm.log 00:29:17.082 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721857861_collect-bmc-pm.bmc.pm.log 00:29:17.766 23:51:02 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT 00:29:17.766 23:51:02 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j96 00:29:17.766 23:51:02 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:17.767 23:51:02 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 
00:29:17.767 23:51:02 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:29:17.767 23:51:02 -- spdk/autopackage.sh@19 -- $ timing_finish 00:29:17.767 23:51:02 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:29:17.767 23:51:02 -- common/autotest_common.sh@737 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:29:17.767 23:51:02 -- common/autotest_common.sh@739 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:29:17.767 23:51:02 -- spdk/autopackage.sh@20 -- $ exit 0 00:29:17.767 23:51:02 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:29:17.767 23:51:02 -- pm/common@29 -- $ signal_monitor_resources TERM 00:29:17.767 23:51:02 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:29:17.767 23:51:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.767 23:51:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:29:17.767 23:51:02 -- pm/common@44 -- $ pid=472911 00:29:17.767 23:51:02 -- pm/common@50 -- $ kill -TERM 472911 00:29:17.767 23:51:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.767 23:51:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:29:17.767 23:51:02 -- pm/common@44 -- $ pid=472912 00:29:17.767 23:51:02 -- pm/common@50 -- $ kill -TERM 472912 00:29:17.767 23:51:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.767 23:51:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:29:17.767 23:51:02 -- pm/common@44 -- $ pid=472914 00:29:17.767 23:51:02 -- pm/common@50 -- $ kill -TERM 472914 00:29:17.767 23:51:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:29:17.767 23:51:02 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:29:17.767 23:51:02 -- pm/common@44 -- $ pid=472939 00:29:17.767 23:51:02 -- pm/common@50 -- $ sudo -E kill -TERM 472939 00:29:18.025 + [[ -n 80539 ]] 00:29:18.025 + sudo kill 80539 00:29:18.034 [Pipeline] } 00:29:18.053 [Pipeline] // stage 00:29:18.060 [Pipeline] } 00:29:18.078 [Pipeline] // timeout 00:29:18.085 [Pipeline] } 00:29:18.103 [Pipeline] // catchError 00:29:18.109 [Pipeline] } 00:29:18.128 [Pipeline] // wrap 00:29:18.135 [Pipeline] } 00:29:18.151 [Pipeline] // catchError 00:29:18.162 [Pipeline] stage 00:29:18.164 [Pipeline] { (Epilogue) 00:29:18.180 [Pipeline] catchError 00:29:18.182 [Pipeline] { 00:29:18.197 [Pipeline] echo 00:29:18.199 Cleanup processes 00:29:18.206 [Pipeline] sh 00:29:18.493 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:18.493 473029 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:29:18.493 473327 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:18.562 [Pipeline] sh 00:29:18.838 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:18.838 ++ grep -v 'sudo pgrep' 00:29:18.838 ++ awk '{print $1}' 00:29:18.838 + sudo kill -9 473029 00:29:18.849 [Pipeline] sh 00:29:19.129 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:29:27.254 [Pipeline] sh 00:29:27.536 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:29:27.536 Artifacts sizes are good 00:29:27.547 [Pipeline] archiveArtifacts 00:29:27.553 Archiving artifacts 00:29:27.679 [Pipeline] sh 00:29:27.968 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:29:27.981 [Pipeline] cleanWs 00:29:27.989 [WS-CLEANUP] Deleting project workspace... 00:29:27.989 [WS-CLEANUP] Deferred wipeout is used... 
00:29:27.995 [WS-CLEANUP] done 00:29:27.997 [Pipeline] } 00:29:28.015 [Pipeline] // catchError 00:29:28.026 [Pipeline] sh 00:29:28.308 + logger -p user.info -t JENKINS-CI 00:29:28.317 [Pipeline] } 00:29:28.333 [Pipeline] // stage 00:29:28.338 [Pipeline] } 00:29:28.356 [Pipeline] // node 00:29:28.361 [Pipeline] End of Pipeline 00:29:28.394 Finished: SUCCESS